Feb 18 05:47:42 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 18 05:47:42 crc restorecon[4702]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 18 05:47:42 crc restorecon[4702]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc 
restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc 
restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 
05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc 
restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc 
restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 18 05:47:42 crc restorecon[4702]:
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]:
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 
05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc 
restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc 
restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc 
restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:42 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc 
restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc 
restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc 
restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 18 05:47:43 crc restorecon[4702]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 18 05:47:43 crc kubenswrapper[4707]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 18 05:47:43 crc kubenswrapper[4707]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 18 05:47:43 crc kubenswrapper[4707]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 18 05:47:43 crc kubenswrapper[4707]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 18 05:47:43 crc kubenswrapper[4707]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 18 05:47:43 crc kubenswrapper[4707]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.773685 4707 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.780920 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.780949 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.780955 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.780961 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.780968 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.780973 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.780980 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.780988 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.780997 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781005 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781012 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781018 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781025 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781030 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781053 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781059 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781064 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781070 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781075 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781080 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781085 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781090 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781096 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 18 05:47:43 crc 
kubenswrapper[4707]: W0218 05:47:43.781101 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781106 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781110 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781117 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781125 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781131 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781136 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781141 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781148 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781187 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781194 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781200 4707 feature_gate.go:330] unrecognized feature gate: Example Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781205 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781210 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781215 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781220 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781226 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781233 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781238 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781243 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781248 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781253 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781258 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781263 4707 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781268 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781274 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781280 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781299 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781306 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781311 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781316 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781321 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781329 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781335 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781340 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781346 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781351 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781357 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781362 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781367 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781372 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781377 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781383 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781388 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781392 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781398 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781403 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.781409 4707 feature_gate.go:330] 
unrecognized feature gate: AWSClusterHostedDNS Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.782965 4707 flags.go:64] FLAG: --address="0.0.0.0" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.782985 4707 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783004 4707 flags.go:64] FLAG: --anonymous-auth="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783012 4707 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783020 4707 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783027 4707 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783036 4707 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783044 4707 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783050 4707 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783057 4707 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783066 4707 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783074 4707 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783082 4707 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783090 4707 flags.go:64] FLAG: --cgroup-root="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783098 4707 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783116 4707 flags.go:64] FLAG: 
--client-ca-file="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783123 4707 flags.go:64] FLAG: --cloud-config="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783128 4707 flags.go:64] FLAG: --cloud-provider="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783134 4707 flags.go:64] FLAG: --cluster-dns="[]" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783145 4707 flags.go:64] FLAG: --cluster-domain="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783151 4707 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783157 4707 flags.go:64] FLAG: --config-dir="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783163 4707 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783170 4707 flags.go:64] FLAG: --container-log-max-files="5" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783177 4707 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783184 4707 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783190 4707 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783197 4707 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783203 4707 flags.go:64] FLAG: --contention-profiling="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783210 4707 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783216 4707 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783224 4707 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783230 4707 flags.go:64] FLAG: 
--cpu-manager-policy-options="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783237 4707 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783244 4707 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783250 4707 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783256 4707 flags.go:64] FLAG: --enable-load-reader="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783262 4707 flags.go:64] FLAG: --enable-server="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783268 4707 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783284 4707 flags.go:64] FLAG: --event-burst="100" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783290 4707 flags.go:64] FLAG: --event-qps="50" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783296 4707 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783302 4707 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783308 4707 flags.go:64] FLAG: --eviction-hard="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783316 4707 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783322 4707 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783328 4707 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783334 4707 flags.go:64] FLAG: --eviction-soft="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783340 4707 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783346 4707 
flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783351 4707 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783366 4707 flags.go:64] FLAG: --experimental-mounter-path="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783372 4707 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783378 4707 flags.go:64] FLAG: --fail-swap-on="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783384 4707 flags.go:64] FLAG: --feature-gates="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783391 4707 flags.go:64] FLAG: --file-check-frequency="20s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783399 4707 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783405 4707 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783412 4707 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783419 4707 flags.go:64] FLAG: --healthz-port="10248" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783426 4707 flags.go:64] FLAG: --help="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783432 4707 flags.go:64] FLAG: --hostname-override="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783438 4707 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783445 4707 flags.go:64] FLAG: --http-check-frequency="20s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783451 4707 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783457 4707 flags.go:64] FLAG: --image-credential-provider-config="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783462 4707 
flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783469 4707 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783474 4707 flags.go:64] FLAG: --image-service-endpoint="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783480 4707 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783486 4707 flags.go:64] FLAG: --kube-api-burst="100" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783492 4707 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783498 4707 flags.go:64] FLAG: --kube-api-qps="50" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783504 4707 flags.go:64] FLAG: --kube-reserved="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783511 4707 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783516 4707 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783522 4707 flags.go:64] FLAG: --kubelet-cgroups="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783529 4707 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783535 4707 flags.go:64] FLAG: --lock-file="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783541 4707 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783547 4707 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783553 4707 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783562 4707 flags.go:64] FLAG: --log-json-split-stream="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783568 4707 
flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783574 4707 flags.go:64] FLAG: --log-text-split-stream="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783580 4707 flags.go:64] FLAG: --logging-format="text" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783585 4707 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783600 4707 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783606 4707 flags.go:64] FLAG: --manifest-url="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783612 4707 flags.go:64] FLAG: --manifest-url-header="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783620 4707 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783626 4707 flags.go:64] FLAG: --max-open-files="1000000" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783633 4707 flags.go:64] FLAG: --max-pods="110" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783639 4707 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783646 4707 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783652 4707 flags.go:64] FLAG: --memory-manager-policy="None" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783658 4707 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783664 4707 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783670 4707 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783676 4707 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783690 4707 flags.go:64] FLAG: --node-status-max-images="50" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783696 4707 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783702 4707 flags.go:64] FLAG: --oom-score-adj="-999" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783708 4707 flags.go:64] FLAG: --pod-cidr="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783714 4707 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783728 4707 flags.go:64] FLAG: --pod-manifest-path="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783734 4707 flags.go:64] FLAG: --pod-max-pids="-1" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783740 4707 flags.go:64] FLAG: --pods-per-core="0" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783746 4707 flags.go:64] FLAG: --port="10250" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783753 4707 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783758 4707 flags.go:64] FLAG: --provider-id="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783764 4707 flags.go:64] FLAG: --qos-reserved="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783770 4707 flags.go:64] FLAG: --read-only-port="10255" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783776 4707 flags.go:64] FLAG: --register-node="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783782 4707 flags.go:64] FLAG: --register-schedulable="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783788 4707 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783826 4707 flags.go:64] FLAG: --registry-burst="10" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783832 4707 flags.go:64] FLAG: --registry-qps="5" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783839 4707 flags.go:64] FLAG: --reserved-cpus="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783844 4707 flags.go:64] FLAG: --reserved-memory="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783852 4707 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783858 4707 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783864 4707 flags.go:64] FLAG: --rotate-certificates="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783879 4707 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783885 4707 flags.go:64] FLAG: --runonce="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783891 4707 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783905 4707 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783912 4707 flags.go:64] FLAG: --seccomp-default="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783923 4707 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783929 4707 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783936 4707 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783942 4707 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783948 
4707 flags.go:64] FLAG: --storage-driver-password="root" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783954 4707 flags.go:64] FLAG: --storage-driver-secure="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783960 4707 flags.go:64] FLAG: --storage-driver-table="stats" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783966 4707 flags.go:64] FLAG: --storage-driver-user="root" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783972 4707 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783978 4707 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783984 4707 flags.go:64] FLAG: --system-cgroups="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783990 4707 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.783999 4707 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784005 4707 flags.go:64] FLAG: --tls-cert-file="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784011 4707 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784024 4707 flags.go:64] FLAG: --tls-min-version="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784032 4707 flags.go:64] FLAG: --tls-private-key-file="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784039 4707 flags.go:64] FLAG: --topology-manager-policy="none" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784047 4707 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784054 4707 flags.go:64] FLAG: --topology-manager-scope="container" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784061 4707 flags.go:64] FLAG: --v="2" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784069 4707 
flags.go:64] FLAG: --version="false" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784077 4707 flags.go:64] FLAG: --vmodule="" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784084 4707 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784090 4707 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784286 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784293 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784298 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784304 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784311 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784318 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784332 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784342 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784347 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784353 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784358 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784363 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784369 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784374 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784379 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784384 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784389 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784394 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784399 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 18 05:47:43 crc 
kubenswrapper[4707]: W0218 05:47:43.784404 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784409 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784415 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784421 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784427 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784432 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784438 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784443 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784450 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784455 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784461 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784466 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784472 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784477 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784483 
4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784488 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784493 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784499 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784506 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784511 4707 feature_gate.go:330] unrecognized feature gate: Example Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784521 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784527 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784532 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784546 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784552 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784559 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784565 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784571 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784577 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784583 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784588 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784595 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784600 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784606 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784611 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784616 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784621 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784627 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784632 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784637 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784642 4707 feature_gate.go:330] unrecognized feature 
gate: IngressControllerLBSubnetsAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784647 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784652 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784657 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784662 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784667 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784672 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784678 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784683 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784690 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784697 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.784704 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.784718 4707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.800508 4707 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.800562 4707 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800697 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800711 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800721 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800732 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800743 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800753 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800763 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800773 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800782 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800823 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800836 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800847 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 18 05:47:43 
crc kubenswrapper[4707]: W0218 05:47:43.800857 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800868 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800879 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800888 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800917 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800925 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800934 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800955 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800964 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800972 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800981 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.800990 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801000 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801009 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801020 4707 
feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801034 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801044 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801052 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801062 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801074 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801085 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801096 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801107 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801119 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801174 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801184 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801193 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801202 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801211 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801219 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801228 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801236 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801244 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801253 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801261 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801270 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801278 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801287 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 18 
05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801295 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801305 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801314 4707 feature_gate.go:330] unrecognized feature gate: Example Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801323 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801331 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801340 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801349 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801360 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801372 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801381 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801390 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801399 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801407 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801416 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801426 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801434 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801444 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801452 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801460 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801469 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801477 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.801492 4707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false 
EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801733 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801746 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801759 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801770 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801779 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801792 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
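The `feature_gate.go:386` summary line just above prints the effective gate set in Go's `map[...]` syntax. When triaging logs like this one, it can help to turn that line into a structured mapping; the sketch below does so with a small regex. The `parse_gate_map` helper and the truncated sample line are illustrative only, not part of any kubelet tooling.

```python
import re

# Sketch: parse the kubelet's "feature gates: {map[Gate:bool ...]}" summary
# line (as logged above) into a Python dict of gate name -> bool.
# The sample line is abbreviated from the log; gate names contain no colons.
line = ("feature gates: {map[CloudDualStackNodeIPs:true "
        "DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false "
        "KMSv1:true NodeSwap:false]}")

def parse_gate_map(line: str) -> dict:
    # Grab everything between "{map[" and "]}" and split on whitespace.
    inner = re.search(r"\{map\[(.*)\]\}", line).group(1)
    return {name: value == "true"
            for name, value in (pair.split(":") for pair in inner.split())}

gates = parse_gate_map(line)
print(gates["KMSv1"])     # True
print(gates["NodeSwap"])  # False
```

Note the same summary appears once per gate-parsing pass in this excerpt, so any of the `feature_gate.go:386` lines can be fed to the parser.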
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801838 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801852 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801863 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801873 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801881 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801890 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801898 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801906 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801915 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801923 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801932 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801940 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801975 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801985 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.801994 4707 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802003 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802011 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802020 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802028 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802036 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802045 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802053 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802061 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802072 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802082 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802090 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802099 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802107 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802116 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802125 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802133 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802144 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802152 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802161 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802169 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802178 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802186 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802194 4707 feature_gate.go:330] unrecognized feature gate: Example Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802202 4707 
feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802210 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802219 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802227 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802236 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802244 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802253 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802263 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802271 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802280 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802289 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802297 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802306 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802318 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
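The bulk of this excerpt is the same `feature_gate.go:330` warning repeated for OpenShift-specific gates (`GatewayAPI`, `NewOLM`, `SigstoreImageVerification`, and so on) that the embedded upstream kubelet does not recognize; each parsing pass re-emits the full list. A quick way to see how many distinct gates are involved, and how often each repeats, is sketched below. `unrecognized_gates` is a hypothetical helper and `LOG` is a three-line stand-in for the real journal text.

```python
import re
from collections import Counter

# Sketch: count "unrecognized feature gate" warnings per gate name in a
# journal excerpt. LOG is a tiny illustrative sample, not the full log.
LOG = """
W0218 05:47:43.784488 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
W0218 05:47:43.800711 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
W0218 05:47:43.784493 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
"""

def unrecognized_gates(text: str) -> Counter:
    # Each warning names exactly one gate at the end of the message.
    return Counter(re.findall(r"unrecognized feature gate: (\S+)", text))

counts = unrecognized_gates(LOG)
print(counts["MultiArchInstallAWS"])  # 2 — logged once per gate-parsing pass
```

On the real excerpt, a count greater than one per gate simply reflects the multiple parsing passes visible above, not a configuration error.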
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802329 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802339 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802349 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802357 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802367 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802376 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802384 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802392 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802400 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802409 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802417 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802426 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.802434 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.802448 4707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false 
EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.803542 4707 server.go:940] "Client rotation is on, will bootstrap in background" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.809985 4707 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.810122 4707 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.811856 4707 server.go:997] "Starting client certificate rotation" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.811906 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.812140 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-29 21:05:54.261734132 +0000 UTC Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.812278 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.840605 4707 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 18 05:47:43 crc kubenswrapper[4707]: E0218 05:47:43.844040 4707 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.844304 4707 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.872136 4707 log.go:25] "Validated CRI v1 runtime API" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.915127 4707 log.go:25] "Validated CRI v1 image API" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.918638 4707 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.923924 4707 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-18-05-43-08-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.923973 4707 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.958318 4707 manager.go:217] Machine: {Timestamp:2026-02-18 05:47:43.953344047 +0000 UTC m=+0.601303251 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8379f7c6-0ce6-4853-8b13-ca6ce9ed6364 BootID:785125e5-7e83-4d00-acfc-a97f7463ff42 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:53:f9:dd Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:53:f9:dd Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:48:bc:96 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:dd:63:45 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:bd:91:c8 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ca:53:8e Speed:-1 Mtu:1496} {Name:eth10 MacAddress:12:8a:4e:74:44:17 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:62:94:38:aa:ae:fe Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 
BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.959040 4707 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.959447 4707 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.960153 4707 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.960368 4707 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.960409 4707 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.960688 4707 topology_manager.go:138] "Creating topology manager with none policy"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.960699 4707 container_manager_linux.go:303] "Creating device plugin manager"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.961247 4707 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.961279 4707 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.961854 4707 state_mem.go:36] "Initialized new in-memory state store"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.961950 4707 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.965652 4707 kubelet.go:418] "Attempting to sync node with API server"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.965678 4707 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.965703 4707 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.965718 4707 kubelet.go:324] "Adding apiserver pod source"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.965730 4707 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.970144 4707 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.971068 4707 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.971416 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.971515 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Feb 18 05:47:43 crc kubenswrapper[4707]: E0218 05:47:43.971524 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError"
Feb 18 05:47:43 crc kubenswrapper[4707]: E0218 05:47:43.971611 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.974024 4707 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.975828 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.975876 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.975894 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.975910 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.975933 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.975947 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.975963 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.975987 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.976015 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.976031 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.976062 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.976078 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.977228 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.978167 4707 server.go:1280] "Started kubelet"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.979127 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.979247 4707 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.979446 4707 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.979868 4707 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 18 05:47:43 crc systemd[1]: Started Kubernetes Kubelet.
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.982954 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.983164 4707 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 18 05:47:43 crc kubenswrapper[4707]: E0218 05:47:43.983448 4707 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.983478 4707 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.983522 4707 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.983541 4707 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.983256 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 16:31:12.412722776 +0000 UTC
Feb 18 05:47:43 crc kubenswrapper[4707]: E0218 05:47:43.984224 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="200ms"
Feb 18 05:47:43 crc kubenswrapper[4707]: W0218 05:47:43.984250 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Feb 18 05:47:43 crc kubenswrapper[4707]: E0218 05:47:43.984328 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.984360 4707 server.go:460] "Adding debug handlers to kubelet server"
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.988955 4707 factory.go:153] Registering CRI-O factory
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.989024 4707 factory.go:221] Registration of the crio container factory successfully
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.989125 4707 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.989143 4707 factory.go:55] Registering systemd factory
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.989216 4707 factory.go:221] Registration of the systemd container factory successfully
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.989251 4707 factory.go:103] Registering Raw factory
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.989275 4707 manager.go:1196] Started watching for new ooms in manager
Feb 18 05:47:43 crc kubenswrapper[4707]: I0218 05:47:43.994651 4707 manager.go:319] Starting recovery of all containers
Feb 18 05:47:43 crc kubenswrapper[4707]: E0218 05:47:43.994406 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18954126f2cb28c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 05:47:43.978121417 +0000 UTC m=+0.626080591,LastTimestamp:2026-02-18 05:47:43.978121417 +0000 UTC m=+0.626080591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005301 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005387 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005417 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005437 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005456 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005476 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005498 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005519 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005546 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005565 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005585 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005640 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005660 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005683 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005703 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005723 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005742 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005760 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005782 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005836 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005863 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005887 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005910 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005934 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.005964 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006061 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006090 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006112 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006131 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006150 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006169 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006190 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006208 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006230 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006249 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006267 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006287 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006308 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006328 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006350 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006371 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006391 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006412 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006432 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006452 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006471 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006489 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006509 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006535 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006565 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006586 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006605 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006632 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006654 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.006676 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011056 4707 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011125 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011153 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011173 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011194 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011218 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011237 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011257 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011276 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011296 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011322 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011342 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011361 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011379 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011398 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011418 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011437 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011475 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011495 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011517 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011542 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011568 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011592 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config"
seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011611 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011642 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011662 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011683 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011706 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011726 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011747 4707 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011768 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011787 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011838 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011860 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011880 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011902 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011922 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011942 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011966 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.011986 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012005 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012025 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012114 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012137 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012157 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012178 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012199 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012219 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012240 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012263 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012293 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012351 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012375 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012398 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012420 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012441 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012463 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012485 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012513 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012543 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012571 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012592 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012612 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012632 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012651 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012672 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012693 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012714 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012733 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012828 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012852 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012895 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012915 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012936 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012958 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012977 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.012996 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013018 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" 
seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013036 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013054 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013073 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013095 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013114 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013132 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 
05:47:44.013150 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013168 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013187 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013206 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013224 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013243 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013264 4707 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013291 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013314 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013332 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013352 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013372 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013391 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013412 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013435 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013455 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013475 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013495 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013519 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013549 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013574 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013595 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013614 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013634 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013657 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013678 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013697 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013719 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013739 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013764 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.013783 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" 
seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014010 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014039 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014058 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014109 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014134 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014178 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 
05:47:44.014199 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014221 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014275 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014297 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014317 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014336 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014355 4707 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014373 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014394 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014411 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014428 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014449 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014467 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014485 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014506 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014532 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014559 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014586 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014605 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014625 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014649 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014667 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014686 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014704 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014728 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 
18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014747 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014767 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014785 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014833 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014853 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014873 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014893 4707 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014909 4707 reconstruct.go:97] "Volume reconstruction finished" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.014924 4707 reconciler.go:26] "Reconciler: start to sync state" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.029469 4707 manager.go:324] Recovery completed Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.045718 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.047949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.048003 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.048018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.049225 4707 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.049524 4707 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.049543 4707 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.049566 4707 state_mem.go:36] "Initialized new in-memory state store" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.051695 4707 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.051759 4707 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.051827 4707 kubelet.go:2335] "Starting kubelet main sync loop" Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.051902 4707 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 18 05:47:44 crc kubenswrapper[4707]: W0218 05:47:44.052883 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.052994 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.070638 4707 policy_none.go:49] "None policy: Start" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.073290 4707 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.073329 4707 state_mem.go:35] "Initializing new in-memory state store" Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.083892 4707 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.127829 4707 manager.go:334] "Starting Device Plugin manager" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.128325 4707 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.128412 4707 server.go:79] "Starting device plugin registration server" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.128953 4707 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.129073 4707 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.129643 4707 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.129855 4707 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.129939 4707 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.144161 4707 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.152597 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.152768 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.154599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.154652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.154661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.154905 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.156033 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.156080 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.156230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.156249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.156257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.156326 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.156646 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.156673 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157510 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157525 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc 
kubenswrapper[4707]: I0218 05:47:44.157700 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.157725 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.158628 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.158774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.158868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.158885 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.158937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.158950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.159100 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.159321 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.159405 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.160043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.160074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.160086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.160221 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.160250 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.160705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.160730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.160740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.161153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.161225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc 
kubenswrapper[4707]: I0218 05:47:44.161252 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.185539 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="400ms" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217469 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 
05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217955 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.217987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.218020 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.229300 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.230540 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.230593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.230613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.230645 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.231324 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320102 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320750 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320828 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.321100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320882 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.320764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.321221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.321381 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.321410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.321515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.321612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.321668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.321719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.321858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.322219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.322348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.322425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.322477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.322488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.432416 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.434249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.434298 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.434311 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.434345 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.434973 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.485983 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.503322 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.521346 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.529320 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.532321 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 18 05:47:44 crc kubenswrapper[4707]: W0218 05:47:44.544329 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f5a9412be672cb5881274b2f45df54fdc23205458158f8ae84178a961097ddca WatchSource:0}: Error finding container f5a9412be672cb5881274b2f45df54fdc23205458158f8ae84178a961097ddca: Status 404 returned error can't find the container with id f5a9412be672cb5881274b2f45df54fdc23205458158f8ae84178a961097ddca Feb 18 05:47:44 crc kubenswrapper[4707]: W0218 05:47:44.545908 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8403390fdc01ff6ce7d54f611c42c0af0a987e10243950d0f2612f1675db4ed7 WatchSource:0}: Error finding container 8403390fdc01ff6ce7d54f611c42c0af0a987e10243950d0f2612f1675db4ed7: Status 404 returned error can't find the container with id 8403390fdc01ff6ce7d54f611c42c0af0a987e10243950d0f2612f1675db4ed7 Feb 18 05:47:44 crc kubenswrapper[4707]: W0218 05:47:44.559292 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a3b2f80ea5c18577ae4c1fbdd4c3240336206580ef1e1de1c28f1450751743d1 WatchSource:0}: Error finding container a3b2f80ea5c18577ae4c1fbdd4c3240336206580ef1e1de1c28f1450751743d1: Status 404 returned error can't find the container with id a3b2f80ea5c18577ae4c1fbdd4c3240336206580ef1e1de1c28f1450751743d1 Feb 18 05:47:44 crc kubenswrapper[4707]: W0218 05:47:44.563313 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-2b3ba8b24ac4d9552a7ca976a6ec9a56feddc66d9814284256c69f78a30f0b54 WatchSource:0}: Error finding container 2b3ba8b24ac4d9552a7ca976a6ec9a56feddc66d9814284256c69f78a30f0b54: Status 404 returned error can't find the container with id 2b3ba8b24ac4d9552a7ca976a6ec9a56feddc66d9814284256c69f78a30f0b54 Feb 18 05:47:44 crc kubenswrapper[4707]: W0218 05:47:44.563900 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-967da4d12f767fe0fb85d897b28e5ef11b3681f5012b7f7ef4763a9d12732831 WatchSource:0}: Error finding container 967da4d12f767fe0fb85d897b28e5ef11b3681f5012b7f7ef4763a9d12732831: Status 404 returned error can't find the container with id 967da4d12f767fe0fb85d897b28e5ef11b3681f5012b7f7ef4763a9d12732831 Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.586381 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="800ms" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.835544 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:44 crc 
kubenswrapper[4707]: I0218 05:47:44.837657 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.837680 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.837687 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.837715 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.838059 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc" Feb 18 05:47:44 crc kubenswrapper[4707]: W0218 05:47:44.958290 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:44 crc kubenswrapper[4707]: E0218 05:47:44.958457 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 05:47:44.980260 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:44 crc kubenswrapper[4707]: I0218 
05:47:44.984215 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 17:15:03.223106005 +0000 UTC Feb 18 05:47:45 crc kubenswrapper[4707]: W0218 05:47:45.022203 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:45 crc kubenswrapper[4707]: E0218 05:47:45.022333 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.058180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"967da4d12f767fe0fb85d897b28e5ef11b3681f5012b7f7ef4763a9d12732831"} Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.060168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2b3ba8b24ac4d9552a7ca976a6ec9a56feddc66d9814284256c69f78a30f0b54"} Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.061469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a3b2f80ea5c18577ae4c1fbdd4c3240336206580ef1e1de1c28f1450751743d1"} Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.062558 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f5a9412be672cb5881274b2f45df54fdc23205458158f8ae84178a961097ddca"} Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.064286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8403390fdc01ff6ce7d54f611c42c0af0a987e10243950d0f2612f1675db4ed7"} Feb 18 05:47:45 crc kubenswrapper[4707]: W0218 05:47:45.172638 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:45 crc kubenswrapper[4707]: E0218 05:47:45.172761 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Feb 18 05:47:45 crc kubenswrapper[4707]: E0218 05:47:45.387650 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="1.6s" Feb 18 05:47:45 crc kubenswrapper[4707]: W0218 05:47:45.542458 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:45 crc kubenswrapper[4707]: E0218 05:47:45.543226 4707 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.638317 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.641418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.641486 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.641508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.641564 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 05:47:45 crc kubenswrapper[4707]: E0218 05:47:45.642356 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc" Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.879244 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 18 05:47:45 crc kubenswrapper[4707]: E0218 05:47:45.881600 4707 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.17:6443: connect: connection refused" 
logger="UnhandledError" Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.980549 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:45 crc kubenswrapper[4707]: I0218 05:47:45.984538 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 07:27:09.699519584 +0000 UTC Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.069959 4707 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="26834895f33fa4c479beb5f5d6fe88b8baa70b95f26de75df8a7355cacdb12e6" exitCode=0 Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.070126 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"26834895f33fa4c479beb5f5d6fe88b8baa70b95f26de75df8a7355cacdb12e6"} Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.070245 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.072557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.072601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.072614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.075638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0115439de27bcbc6ef2fb4f8cdff46c1fe683fb9ef9a3d0e2881559fe42be05c"} Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.075688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5df223e562fb459695296fba0ea109910acb08c390c533d5cef4aee4514b892b"} Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.075701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a2b582707cfde485cc1f3a965f343c1702adf31e1f38b2bf4ee3667dace57ffe"} Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.075715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"41f45798868cbd0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4"} Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.075843 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.076868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.076917 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.076937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.081247 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae" exitCode=0 Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.081371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae"} Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.081458 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.082937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.082983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.082997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.084483 4707 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="c456a06fe10f43eec51369d2b514208e3e2d161d013d92e406ad3bfd40ab7b36" exitCode=0 Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.084554 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.084546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"c456a06fe10f43eec51369d2b514208e3e2d161d013d92e406ad3bfd40ab7b36"} Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.085493 4707 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.085538 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.085557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.085571 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.086159 4707 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a" exitCode=0 Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.086210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a"} Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.086384 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.087131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.087156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.087166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.087196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:46 crc 
kubenswrapper[4707]: I0218 05:47:46.087232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.087245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.158483 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.288024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:46 crc kubenswrapper[4707]: W0218 05:47:46.850531 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:46 crc kubenswrapper[4707]: E0218 05:47:46.850629 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.980460 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:46 crc kubenswrapper[4707]: I0218 05:47:46.985008 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 15:07:14.315888575 +0000 UTC 
Feb 18 05:47:46 crc kubenswrapper[4707]: E0218 05:47:46.988554 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="3.2s" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.092097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"669ad93e73ae22396ec9d6179dab9549281fd7fc8b910f544f827f46991f586c"} Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.092139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c0b23cd88a0e2578bd6ea51f25561f162f68b70c5cda8974d5b41908fb788def"} Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.092149 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ca59390914af671d054791822f17c531d0649f08e4e897a17d0e52d0ac74a0bd"} Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.092240 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.093896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.093948 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.093963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 
05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.099834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e"} Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.099895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d"} Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.099917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e"} Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.101637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7f62271c1ef8bf85b32eecc6eac464a8de0d7a3fe903dddaab54872582d13d41"} Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.101760 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.102878 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.102907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.102920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 
05:47:47.105190 4707 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594" exitCode=0 Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.105305 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.105855 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.106203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594"} Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.106748 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.106780 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.106843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.107533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.107559 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.107571 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:47 crc kubenswrapper[4707]: W0218 05:47:47.178408 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:47 crc kubenswrapper[4707]: E0218 05:47:47.178525 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Feb 18 05:47:47 crc kubenswrapper[4707]: E0218 05:47:47.181237 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18954126f2cb28c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 05:47:43.978121417 +0000 UTC m=+0.626080591,LastTimestamp:2026-02-18 05:47:43.978121417 +0000 UTC m=+0.626080591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.243441 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.244596 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.244637 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:47 crc 
kubenswrapper[4707]: I0218 05:47:47.244646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.244669 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 05:47:47 crc kubenswrapper[4707]: E0218 05:47:47.245134 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc" Feb 18 05:47:47 crc kubenswrapper[4707]: W0218 05:47:47.274469 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Feb 18 05:47:47 crc kubenswrapper[4707]: E0218 05:47:47.274564 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Feb 18 05:47:47 crc kubenswrapper[4707]: I0218 05:47:47.985882 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 18:00:48.090207941 +0000 UTC Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.114098 4707 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52" exitCode=0 Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.114224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52"} Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.114418 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.116049 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.116139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.116159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.121003 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.121088 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.121475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7"} Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.121565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61"} Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.121780 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.121947 4707 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.122748 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.123112 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.123342 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.123418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.123441 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.124740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.124827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.124848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.125438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.125498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.125520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 
05:47:48.125501 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.125593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.125622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.791475 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:48 crc kubenswrapper[4707]: I0218 05:47:48.986416 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 03:05:43.381751949 +0000 UTC Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.131313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"95cc465a09cf3ffc292e6f5042bdcb8029b950af0dca4ca8e23bb553ae09f734"} Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.131440 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.131543 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.131673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1ce5a01f9f1d2c88e96636d93dcfc4bcf5c139c5c703c6583f02dd5fd4ba0fe7"} Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.131730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bb131874e79fd4df50fec3942a525e8f155fb0fee3a3a879b7d7b97f800caa94"} Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.133292 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.133365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.133396 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.501962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.502193 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.502272 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.504233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.504329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.504349 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:49 crc kubenswrapper[4707]: I0218 05:47:49.986909 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 10:21:08.519269511 +0000 UTC Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.142377 
4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.142382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3687e40ec33f7a3a2355e0becbc12cf9c21ac8cfe05dbdbed91c4a881cc59ec9"} Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.142451 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.142455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ad8c12313c0ae6dea2ac2553caf49f4a2e459287b6271098413c2f4515674c2b"} Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.142500 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.143993 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.144043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.144061 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.144361 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.144416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.144441 4707 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.151671 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.374971 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.446283 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.448113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.448168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.448183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.448217 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 05:47:50 crc kubenswrapper[4707]: I0218 05:47:50.987950 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 18:53:24.091507058 +0000 UTC Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.145279 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.145279 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.152191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 
05:47:51.152253 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.152267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.153137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.153245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.153273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.664692 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.741685 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 18 05:47:51 crc kubenswrapper[4707]: I0218 05:47:51.989103 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 15:06:41.847805521 +0000 UTC Feb 18 05:47:52 crc kubenswrapper[4707]: I0218 05:47:52.148023 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:52 crc kubenswrapper[4707]: I0218 05:47:52.148065 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:52 crc kubenswrapper[4707]: I0218 05:47:52.149125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:52 crc kubenswrapper[4707]: I0218 05:47:52.149156 4707 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:52 crc kubenswrapper[4707]: I0218 05:47:52.149165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:52 crc kubenswrapper[4707]: I0218 05:47:52.149353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:52 crc kubenswrapper[4707]: I0218 05:47:52.149397 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:52 crc kubenswrapper[4707]: I0218 05:47:52.149416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:52 crc kubenswrapper[4707]: I0218 05:47:52.990242 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 06:21:17.690606476 +0000 UTC Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.044942 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.045170 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.045244 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.047117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.047169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.047180 4707 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.429231 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.429586 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.431572 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.431611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.431623 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.613787 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.614201 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.616054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.616130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.616157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.630028 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" 
Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.630219 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.631467 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.631532 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.631551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:47:53 crc kubenswrapper[4707]: I0218 05:47:53.990998 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 09:27:58.473506088 +0000 UTC Feb 18 05:47:54 crc kubenswrapper[4707]: E0218 05:47:54.144593 4707 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 18 05:47:54 crc kubenswrapper[4707]: I0218 05:47:54.991181 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:37:26.036707633 +0000 UTC Feb 18 05:47:55 crc kubenswrapper[4707]: I0218 05:47:55.992087 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:56:10.645490769 +0000 UTC Feb 18 05:47:56 crc kubenswrapper[4707]: I0218 05:47:56.045542 4707 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Feb 18 05:47:56 crc kubenswrapper[4707]: I0218 05:47:56.045662 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 18 05:47:56 crc kubenswrapper[4707]: I0218 05:47:56.993115 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 00:22:39.119234444 +0000 UTC Feb 18 05:47:57 crc kubenswrapper[4707]: I0218 05:47:57.046606 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 18 05:47:57 crc kubenswrapper[4707]: I0218 05:47:57.046798 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 18 05:47:57 crc kubenswrapper[4707]: I0218 05:47:57.981283 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 18 05:47:57 crc kubenswrapper[4707]: I0218 05:47:57.993624 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 07:54:36.143255729 
+0000 UTC Feb 18 05:47:58 crc kubenswrapper[4707]: W0218 05:47:58.206021 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 18 05:47:58 crc kubenswrapper[4707]: I0218 05:47:58.206167 4707 trace.go:236] Trace[866139945]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 05:47:48.204) (total time: 10001ms): Feb 18 05:47:58 crc kubenswrapper[4707]: Trace[866139945]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:47:58.206) Feb 18 05:47:58 crc kubenswrapper[4707]: Trace[866139945]: [10.001494502s] [10.001494502s] END Feb 18 05:47:58 crc kubenswrapper[4707]: E0218 05:47:58.206204 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 18 05:47:58 crc kubenswrapper[4707]: I0218 05:47:58.771074 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 18 05:47:58 crc kubenswrapper[4707]: I0218 05:47:58.771202 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" 
output="HTTP probe failed with statuscode: 403" Feb 18 05:47:58 crc kubenswrapper[4707]: I0218 05:47:58.777922 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 18 05:47:58 crc kubenswrapper[4707]: I0218 05:47:58.778389 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 18 05:47:58 crc kubenswrapper[4707]: I0218 05:47:58.994445 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 09:41:17.754890012 +0000 UTC Feb 18 05:47:59 crc kubenswrapper[4707]: I0218 05:47:59.995217 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 12:43:34.16712555 +0000 UTC Feb 18 05:48:00 crc kubenswrapper[4707]: I0218 05:48:00.996157 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 05:51:16.89227912 +0000 UTC Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.675060 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.675332 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.677341 4707 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.677441 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.677463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.684943 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.733431 4707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.977507 4707 apiserver.go:52] "Watching apiserver" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.985222 4707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.985687 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.986297 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.986531 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.986665 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.986718 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:01 crc kubenswrapper[4707]: E0218 05:48:01.986838 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.986889 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:01 crc kubenswrapper[4707]: E0218 05:48:01.987177 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.987971 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:01 crc kubenswrapper[4707]: E0218 05:48:01.988037 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.989380 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.989910 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.990346 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.990388 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.990380 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.990797 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.991563 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.992547 4707 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.993295 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 18 05:48:01 crc kubenswrapper[4707]: I0218 05:48:01.996266 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 00:22:13.474863184 +0000 UTC Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.061751 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.075977 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.084663 4707 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.093126 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.111370 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.125960 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.140536 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.153042 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.166221 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.181437 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.200972 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 05:48:02 crc kubenswrapper[4707]: I0218 05:48:02.997271 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 01:05:13.006094572 +0000 UTC Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.052148 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.052280 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.184115 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.622778 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.653131 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.657427 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470ed6e7-c010-42bb-8889-a7a460f92c43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3adbbb671874de9106ed3fc3db2f2889d0
dbbff6c71046d68736dde5877e61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.672409 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.677815 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.691265 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.694439 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.695527 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.708776 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.724621 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.745342 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.768829 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.772488 4707 trace.go:236] Trace[1946417396]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 05:47:51.666) (total time: 12106ms): Feb 18 05:48:03 crc kubenswrapper[4707]: Trace[1946417396]: ---"Objects listed" error: 12106ms (05:48:03.772) Feb 18 05:48:03 crc kubenswrapper[4707]: Trace[1946417396]: [12.106121893s] [12.106121893s] END Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.772532 4707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.773282 4707 trace.go:236] Trace[515073217]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 05:47:52.172) (total time: 11600ms): Feb 18 05:48:03 crc kubenswrapper[4707]: Trace[515073217]: ---"Objects listed" error: 11600ms (05:48:03.773) Feb 18 05:48:03 crc kubenswrapper[4707]: Trace[515073217]: 
[11.600854923s] [11.600854923s] END Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.773320 4707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.773644 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.775177 4707 trace.go:236] Trace[1751922037]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 05:47:51.704) (total time: 12070ms): Feb 18 05:48:03 crc kubenswrapper[4707]: Trace[1751922037]: ---"Objects listed" error: 12070ms (05:48:03.774) Feb 18 05:48:03 crc kubenswrapper[4707]: Trace[1751922037]: [12.070907909s] [12.070907909s] END Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.775249 4707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.776736 4707 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.778364 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.779541 4707 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.788036 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470ed6e7-c010-42bb-8889-a7a460f92c43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.807589 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.814278 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cfb79b-ea09-4ff2-8401-b2341d99931c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce5a01f9f1d2c88e96636d93dcfc4bcf5c139c5c703c6583f02dd5fd4ba0fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cc465a09cf3ffc292e6f5042bdcb8029b950af0dca4ca8e23bb553ae09f734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad8c12313c0ae6dea2ac2553caf49f4a2e459287b6271098413c2f4515674c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3687e40ec33f7a3a2355e0becbc12cf9c21ac8cfe05dbdbed91c4a881cc59ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb131874e79fd4df50fec3942a525e8f155fb0fee3a3a879b7d7b97f800caa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://db
83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.830401 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.848147 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.863344 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.879556 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaff3d88-738e-4104-817c-652ac5d188dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2b582707cfde485cc1f3a965f343c1702adf31e1f38b2bf4ee3667dace57ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f45798868cb
d0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df223e562fb459695296fba0ea109910acb08c390c533d5cef4aee4514b892b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0115439de27bcbc6ef2fb4f8cdff46c1fe683fb9ef9a3d0e2881559fe42be05c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880566 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880602 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880860 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.880941 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881202 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" 
(UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881438 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 18 
05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882013 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882195 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882894 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882985 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 05:48:03 crc 
kubenswrapper[4707]: I0218 05:48:03.883031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883111 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883370 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884244 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884287 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 
05:48:03.884482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884922 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885124 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" 
(UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.881967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882452 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882765 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.882985 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883206 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883703 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.883911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.884373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.886556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.886600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887438 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887540 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887569 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887888 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.887999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888153 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888201 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888302 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888640 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888772 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888817 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888896 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888922 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888994 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889657 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889797 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890040 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890148 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID:
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890605 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890658 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890762 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890775 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890786 4707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890820 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890835 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890848 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890861 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890874 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890887 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890901 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890915 4707 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890929 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 18 
05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890942 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890955 4707 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890969 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890982 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890995 4707 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891008 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891020 4707 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891033 4707 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891049 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891062 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891077 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891092 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891106 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891118 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891131 4707 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891144 4707 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891156 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891168 4707 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891184 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891197 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891209 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891223 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.908295 4707 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.888856 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.889493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.890870 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.891968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.892213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.892712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.892900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.893088 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.893178 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.893293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.893738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.893999 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:04.393968642 +0000 UTC m=+21.041927786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.910155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.910252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.910328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.910447 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.910622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.910610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.894201 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.894586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.894604 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.894922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.895265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.895618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.895484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.896440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.896864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.897222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.897681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.897705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.897891 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.898250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.898281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.898435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.898694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.898409 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.898903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.899207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.899306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.899338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.900496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.900691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.901056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.901241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.901274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.901395 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.901911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.902001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.906158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.907944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.908334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.908720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.908712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.909111 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.909211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.909285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.909355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.909662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.909688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.910862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.911321 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.885737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.911692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.911748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.911678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.912070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.912316 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.912586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.912622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.912647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.912696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.912745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.913085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.913067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.913889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.914224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.914321 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.914769 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.914871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.915033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.915195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.915237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.915518 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.915760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.915737 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.915901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.915908 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.915620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.915625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.916063 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.916282 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:04.416241338 +0000 UTC m=+21.064200502 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.916823 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.917737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.917825 4707 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.918625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.918691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.918741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.918881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.918935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.919112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.919484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.920025 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.928583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.929570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.930235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.930633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.931175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.931895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.932546 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.932996 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.933105 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:04.433080892 +0000 UTC m=+21.081040036 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.933668 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.935428 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.936918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.937004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.935940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.937334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.936333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.937148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.937600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.937633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.937659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.939065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.939130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.938218 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.938231 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.939545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.940086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.940208 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.940231 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.940297 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:04.440272242 +0000 UTC m=+21.088231376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.935584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.938309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.941130 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.938540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.938628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.938660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.938682 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.939045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.939280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.939305 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.939520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.939544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.939586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.940768 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.941712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.943930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.944127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.944520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.945294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.954723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.955270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.945354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" 
(OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.946072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.946233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.946755 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.952950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.953142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.954027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.955150 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.958941 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.958959 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:03 crc kubenswrapper[4707]: E0218 05:48:03.959043 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:04.458997815 +0000 UTC m=+21.106956949 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.956710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.960682 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.964552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.964892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.965844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.965822 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.969450 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.978299 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.979341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.987773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.990310 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991717 4707 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991729 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991741 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991752 4707 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991761 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991772 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991785 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991813 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991825 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991838 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991848 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991859 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991868 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991879 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991888 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on 
node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991898 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.991908 4707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992120 4707 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992131 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992141 
4707 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992151 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992162 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992171 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992182 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992191 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992201 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992213 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992226 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992235 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992245 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992254 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992266 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992277 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992288 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc 
kubenswrapper[4707]: I0218 05:48:03.992297 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992307 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992316 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992348 4707 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992357 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992367 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992377 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992388 4707 reconciler_common.go:293] "Volume detached for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992397 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992409 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992418 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992428 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992437 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992448 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992458 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992468 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992476 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992486 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992495 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992505 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992516 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992528 4707 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992538 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992548 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992559 4707 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992569 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992578 4707 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992587 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992596 4707 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc 
kubenswrapper[4707]: I0218 05:48:03.992608 4707 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992675 4707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992704 4707 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992721 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992735 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992749 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992764 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992781 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" 
(UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992818 4707 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992834 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992857 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992873 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992886 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992899 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992913 4707 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992928 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992943 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992957 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992971 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992984 4707 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.992998 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993012 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") 
on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993028 4707 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993046 4707 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993061 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993077 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993091 4707 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993105 4707 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993118 4707 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 
18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993132 4707 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993145 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993159 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993179 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993192 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993207 4707 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993221 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993235 4707 reconciler_common.go:293] 
"Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993251 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993267 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993283 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993295 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993310 4707 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993323 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993340 4707 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993354 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993366 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993379 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993393 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993406 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993420 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993434 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" 
DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993447 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993460 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993474 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993486 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993503 4707 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993516 4707 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993529 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993545 4707 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993558 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993570 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993582 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993595 4707 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993608 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993624 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993638 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993653 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993680 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993694 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993708 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993721 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993734 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:03 crc kubenswrapper[4707]: I0218 05:48:03.993749 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993761 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993776 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993817 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993831 4707 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993845 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993858 4707 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993872 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993885 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993897 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993909 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993922 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993934 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993948 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993962 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node 
\"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993975 4707 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.993989 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.994003 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.994016 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.994029 4707 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.994043 4707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.994056 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 18 
05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.994069 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.994083 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.994253 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:03.997626 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 23:53:38.431700565 +0000 UTC Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.004325 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.033321 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.052728 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.053037 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.053237 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.053444 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.057990 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.058710 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.060383 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaff3d88-738e-4104-817c-652ac5d188dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2b582707cfde485cc1f3a965f343c1702adf31e1f38b2bf4ee3667dace57ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f45798868cbd0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df223e562fb459695296fba0ea109910acb08c390c533d5cef4aee4514b892b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0115439de27bcbc6ef2fb4f8cdff46c1fe683fb9ef9a3d0e2881559fe42be05c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.060505 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.061410 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.063513 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.064178 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.065115 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.066460 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.067397 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.068733 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.069470 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.071203 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.072264 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.072992 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.074528 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.075258 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.076077 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.076726 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.077277 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.078195 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.079542 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.080107 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.081270 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.081788 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.083011 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.083479 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.084148 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.085551 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.086207 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.087518 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.088115 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.089039 4707 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.089143 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.091142 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.092376 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.092475 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.092846 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.094877 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.095018 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.095989 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.097051 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.097840 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.098905 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.099403 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.100466 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.101348 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.102483 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.103022 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.104251 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.104734 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.104947 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.104959 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.106303 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.106774 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.107747 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.108326 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.109617 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.110396 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.111014 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.119164 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470ed6e7-c010-42bb-8889-a7a460f92c43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\
\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.139343 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cfb79b-ea09-4ff2-8401-b2341d99931c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce5a01f9f1d2c88e96636d93dcfc4bcf5c139c5c703c6583f02dd5fd4ba0fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cc465a09cf3ffc292e6f5042bdcb8029b950af0dca4ca8e23bb553ae09f734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad8c12313c0ae6dea2ac2553caf49f4a2e459287b6271098413c2f4515674c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3687e40ec33f7a3a2355e0becbc12cf9c21ac8cfe05dbdbed91c4a881cc59ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb131874e79fd4df50fec3942a525e8f155fb0fee3a3a879b7d7b97f800caa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\
",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.141462 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.153458 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470ed6e7-c010-42bb-8889-a7a460f92c43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver
-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.159075 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.176125 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cfb79b-ea09-4ff2-8401-b2341d99931c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce5a01f9f1d2c88e96636d93dcfc4bcf5c139c5c703c6583f02dd5fd4ba0fe7\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cc465a09cf3ffc292e6f5042bdcb8029b950af0dca4ca8e23bb553ae09f734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad8c12313c0ae6dea2ac2553caf49f4a2e459287b6271098413c2f4515674c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3687e40ec33f7a3a2355e0becbc12cf9c21ac8cfe05dbdbed91c4a881cc59ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb131874e79fd4df50fec3942a525e8f155fb0fee3a3a879b7d7b97f800caa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",
\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}}
},{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.188854 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.192747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c3ef11d7438365cc11f8e20e7c666a751439bfb8a52d9e42fa61115e35ea6070"} Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.194556 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"00e1e7a5700196aa341ca986ed49a5ea0fe397714d5a56b34ac0a4e4465e529f"} Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.198484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c6fb5b70dd3debcff83678e7f579c1c2957180ad03b4619efb73d99acac8a477"} Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.199784 4707 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.211720 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.229425 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaff3d88-738e-4104-817c-652ac5d188dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with 
unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2b582707cfde485cc1f3a965f343c1702adf31e1f38b2bf4ee3667dace57ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f45798868cbd0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df223e562f
b459695296fba0ea109910acb08c390c533d5cef4aee4514b892b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0115439de27bcbc6ef2fb4f8cdff46c1fe683fb9ef9a3d0e2881559fe42be05c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.243373 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.257634 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.270341 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.397352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.398825 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:05.398745576 +0000 UTC m=+22.046704940 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.442422 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.479596 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cfb79b-ea09-4ff2-8401-b2341d99931c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce5a01f9f1d2c88e96636d93dcfc4bcf5c139c5c703c6583f02dd5fd4ba0fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cc465a09cf3ffc292e6f5042bdcb8029b950af0dca4ca8e23bb553ae09f734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad8c12313c0ae6dea2ac2553caf49f4a2e459287b6271098413c2f4515674c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3687e40ec33f7a3a2355e0becbc12cf9c21ac8cfe05dbdbed91c4a881cc59ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb131874e79fd4df50fec3942a525e8f155fb0fee3a3a879b7d7b97f800caa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/li
b/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.490754 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.498527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.498581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.498626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.498646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.498781 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.498812 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.498823 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.498876 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:05.498864823 +0000 UTC m=+22.146823957 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.498840 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.499008 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:05.498971216 +0000 UTC m=+22.146930390 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.499127 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.499185 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:05.499170181 +0000 UTC m=+22.147129355 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.499170 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.499236 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.499257 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:04 crc kubenswrapper[4707]: E0218 05:48:04.499306 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:05.499292894 +0000 UTC m=+22.147252068 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.506006 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.524016 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.537927 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470ed6e7-c010-42bb-8889-a7a460f92c43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:4
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.549610 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.566628 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.578493 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.589691 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaff3d88-738e-4104-817c-652ac5d188dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2b582707cfde485cc1f3a965f343c1702adf31e1f38b2bf4ee3667dace57ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f45798868cbd0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df223e562fb459695296fba0ea109910acb08c390c533d5cef4aee4514b892b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0115439de27bcbc6ef2fb4f8cdff46c1fe683fb9ef9a3d0e2881559fe42be05c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 05:48:04 crc kubenswrapper[4707]: I0218 05:48:04.998206 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 19:39:01.87610244 +0000 UTC Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.053171 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.053457 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.203957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a9d0b027917e5af6f725b7352fb0a1538f324a50184e0ee42f4d6fdfe9975478"} Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.208001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0e3e2bf87253927e3b453cc94cef1bb1185f76d2ed216aec0e4b70a175a331e0"} Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.208096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"67f80d925a0cd434f1387eb0133a8259d01a344648b5b92e918dfd6cb5741cfd"} Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.214315 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.231584 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.256556 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.280355 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.303051 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470ed6e7-c010-42bb-8889-a7a460f92c43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:4
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.326191 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cfb79b-ea09-4ff2-8401-b2341d99931c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce5a01f9f1d2c88e96636d93dcfc4bcf5c139c5c703c6583f02dd5fd4ba0fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cc465a09cf3ffc292e6f5042bdcb8029b950af0dca4ca8e23bb553ae09f734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad8c12313c0ae6dea2ac2553caf49f4a2e459287b6271098413c2f4515674c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3687e40ec33f7a3a2355e0becbc12cf9c21ac8cfe05dbdbed91c4a881cc59ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb131874e79fd4df50fec3942a525e8f155fb0fee3a3a879b7d7b97f800caa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e4
11efe2d8a981b40ab68bf131796b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.350225 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9d0b027917e5af6f725b7352fb0a1538f324a50184e0ee42f4d6fdfe9975478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.376779 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.398313 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.408926 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 
05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.409115 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:07.409084514 +0000 UTC m=+24.057043668 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.415382 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaff3d88-738e-4104-817c-652ac5d188dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2b582707cfde485cc1f3a965f343c1702adf31e1f38b2bf4ee3667dace57ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f45798868cbd0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df223e562fb459695296fba0ea1
09910acb08c390c533d5cef4aee4514b892b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0115439de27bcbc6ef2fb4f8cdff46c1fe683fb9ef9a3d0e2881559fe42be05c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.447639 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cfb79b-ea09-4ff2-8401-b2341d99931c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce5a01f9f1d2c88e96636d93dcfc4bcf5c139c5c703c6583f02dd5fd4ba0fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cc465a09cf3ffc292e6f5042bdcb8029b950af0dca4ca8e23bb553ae09f734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad8c12313c0ae6dea2ac2553caf49f4a2e459287b6271098413c2f4515674c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3687e40ec33f7a3a2355e0becbc12cf9c21ac8cfe05dbdbed91c4a881cc59ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb131874e79fd4df50fec3942a525e8f155fb0fee3a3a879b7d7b97f800caa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d79e3438ead12a6f234e042bdba65ec5e411efe2d8a981b40ab68bf131796b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279eb44eed95563ace079b92df03d759e202a5981903dd329a04ac969a8c5594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db83073ec965f6afe4e861a2228212adc2799c7619559d692ccec6742c8ade52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.470143 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.501124 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e3e2bf87253927e3b453cc94cef1bb1185f76d2ed216aec0e4b70a175a331e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f80d925a0cd434f1387eb0133a8259d01a344648b5b92e918dfd6cb5741cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.509552 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.509652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.509712 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.509857 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.510007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.510045 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.510223 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.509905 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.510492 4707 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:07.510464923 +0000 UTC m=+24.158424097 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.509933 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.510569 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.510583 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.510152 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.510653 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:07.510633018 +0000 UTC m=+24.158592152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.510903 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:07.510843723 +0000 UTC m=+24.158802867 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:05 crc kubenswrapper[4707]: E0218 05:48:05.510956 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:07.510926395 +0000 UTC m=+24.158885679 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.532584 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.554360 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470ed6e7-c010-42bb-8889-a7a460f92c43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:4
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T05:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.575887 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9d0b027917e5af6f725b7352fb0a1538f324a50184e0ee42f4d6fdfe9975478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:48:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-
kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.596816 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.615970 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.634931 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaff3d88-738e-4104-817c-652ac5d188dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T05:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2b582707cfde485cc1f3a965f343c1702adf31e1f38b2bf4ee3667dace57ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f45798868cbd0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df223e562fb459695296fba0ea109910acb08c390c533d5cef4aee4514b892b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0115439de27bcbc6ef2fb4f8cdff46c1fe683fb9ef9a3d0e2881559fe42be05c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T05:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T05:47:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T05:48:05Z is after 2025-08-24T17:21:41Z" Feb 18 05:48:05 crc kubenswrapper[4707]: I0218 05:48:05.998964 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 05:13:37.494712231 +0000 UTC Feb 18 05:48:06 crc kubenswrapper[4707]: I0218 05:48:06.053032 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:06 crc kubenswrapper[4707]: I0218 05:48:06.053140 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:06 crc kubenswrapper[4707]: E0218 05:48:06.053212 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:06 crc kubenswrapper[4707]: E0218 05:48:06.053321 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:06 crc kubenswrapper[4707]: I0218 05:48:06.999592 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 05:46:49.783886156 +0000 UTC Feb 18 05:48:07 crc kubenswrapper[4707]: I0218 05:48:07.052417 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.052665 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:07 crc kubenswrapper[4707]: I0218 05:48:07.430921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.431160 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:11.431119535 +0000 UTC m=+28.079078709 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:07 crc kubenswrapper[4707]: I0218 05:48:07.531632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:07 crc kubenswrapper[4707]: I0218 05:48:07.531688 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:07 crc kubenswrapper[4707]: I0218 05:48:07.531724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:07 crc kubenswrapper[4707]: I0218 05:48:07.531755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.531942 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.531965 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.531980 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 
18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.531982 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.532046 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:11.532028383 +0000 UTC m=+28.179987527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.531997 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.532183 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.532226 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.531992 4707 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.532120 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:11.532082094 +0000 UTC m=+28.180041258 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.532456 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:11.532416053 +0000 UTC m=+28.180375227 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:07 crc kubenswrapper[4707]: E0218 05:48:07.532494 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:11.532480424 +0000 UTC m=+28.180439598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.000612 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 11:51:40.121232304 +0000 UTC Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.052241 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.052256 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:08 crc kubenswrapper[4707]: E0218 05:48:08.052438 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:08 crc kubenswrapper[4707]: E0218 05:48:08.052593 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.699316 4707 csr.go:261] certificate signing request csr-8ghr4 is approved, waiting to be issued Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.714845 4707 csr.go:257] certificate signing request csr-8ghr4 is issued Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.914421 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8vb52"] Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.914720 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8vb52" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.916767 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-g4w9k"] Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.916950 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.917309 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.917600 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.917943 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 18 05:48:08 crc kubenswrapper[4707]: W0218 05:48:08.918116 4707 reflector.go:561] object-"openshift-image-registry"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Feb 18 05:48:08 crc kubenswrapper[4707]: E0218 05:48:08.918150 4707 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.918937 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.919108 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.920733 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"kube-root-ca.crt" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.940828 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=5.940811133 podStartE2EDuration="5.940811133s" podCreationTimestamp="2026-02-18 05:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:08.940556987 +0000 UTC m=+25.588516121" watchObservedRunningTime="2026-02-18 05:48:08.940811133 +0000 UTC m=+25.588770267" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.942247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgdr\" (UniqueName: \"kubernetes.io/projected/568dadf7-7307-4dbd-a384-70470dc247e6-kube-api-access-4fgdr\") pod \"node-resolver-8vb52\" (UID: \"568dadf7-7307-4dbd-a384-70470dc247e6\") " pod="openshift-dns/node-resolver-8vb52" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.942279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/568dadf7-7307-4dbd-a384-70470dc247e6-hosts-file\") pod \"node-resolver-8vb52\" (UID: \"568dadf7-7307-4dbd-a384-70470dc247e6\") " pod="openshift-dns/node-resolver-8vb52" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.942305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1585c988-256e-4e50-8aea-b6820c419f11-serviceca\") pod \"node-ca-g4w9k\" (UID: \"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.942320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/1585c988-256e-4e50-8aea-b6820c419f11-host\") pod \"node-ca-g4w9k\" (UID: \"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:08 crc kubenswrapper[4707]: I0218 05:48:08.942340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7qnj\" (UniqueName: \"kubernetes.io/projected/1585c988-256e-4e50-8aea-b6820c419f11-kube-api-access-v7qnj\") pod \"node-ca-g4w9k\" (UID: \"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.001336 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 12:59:19.754583186 +0000 UTC Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.014147 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=7.014127894 podStartE2EDuration="7.014127894s" podCreationTimestamp="2026-02-18 05:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:09.013554179 +0000 UTC m=+25.661513313" watchObservedRunningTime="2026-02-18 05:48:09.014127894 +0000 UTC m=+25.662087038" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.014315 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-p9b84"] Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.014777 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.023586 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.025752 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.026011 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.026762 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.026965 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.038170 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jb6vz"] Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.038934 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.041949 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.042188 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.042672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-os-release\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.042710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-system-cni-dir\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.042740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46bf8216-a7c3-409f-90fc-e33145053129-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.042775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1585c988-256e-4e50-8aea-b6820c419f11-serviceca\") pod \"node-ca-g4w9k\" (UID: 
\"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.042811 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc127100-df64-48e7-bed0-620c796dd6b0-cni-binary-copy\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.042836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-var-lib-cni-multus\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.042852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-cnibin\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-cnibin\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-multus-socket-dir-parent\") pod \"multus-p9b84\" (UID: 
\"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-os-release\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-run-netns\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043211 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgdr\" (UniqueName: \"kubernetes.io/projected/568dadf7-7307-4dbd-a384-70470dc247e6-kube-api-access-4fgdr\") pod \"node-resolver-8vb52\" (UID: \"568dadf7-7307-4dbd-a384-70470dc247e6\") " pod="openshift-dns/node-resolver-8vb52" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-system-cni-dir\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46bf8216-a7c3-409f-90fc-e33145053129-cni-binary-copy\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: 
\"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/568dadf7-7307-4dbd-a384-70470dc247e6-hosts-file\") pod \"node-resolver-8vb52\" (UID: \"568dadf7-7307-4dbd-a384-70470dc247e6\") " pod="openshift-dns/node-resolver-8vb52" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-multus-conf-dir\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-run-multus-certs\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-run-k8s-cni-cncf-io\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws4zl\" (UniqueName: \"kubernetes.io/projected/fc127100-df64-48e7-bed0-620c796dd6b0-kube-api-access-ws4zl\") pod \"multus-p9b84\" (UID: 
\"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1585c988-256e-4e50-8aea-b6820c419f11-host\") pod \"node-ca-g4w9k\" (UID: \"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/568dadf7-7307-4dbd-a384-70470dc247e6-hosts-file\") pod \"node-resolver-8vb52\" (UID: \"568dadf7-7307-4dbd-a384-70470dc247e6\") " pod="openshift-dns/node-resolver-8vb52" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-var-lib-cni-bin\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1585c988-256e-4e50-8aea-b6820c419f11-host\") pod \"node-ca-g4w9k\" (UID: \"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-var-lib-kubelet\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043538 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fc127100-df64-48e7-bed0-620c796dd6b0-multus-daemon-config\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-etc-kubernetes\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-hostroot\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7qnj\" (UniqueName: \"kubernetes.io/projected/1585c988-256e-4e50-8aea-b6820c419f11-kube-api-access-v7qnj\") pod \"node-ca-g4w9k\" (UID: \"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043656 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-multus-cni-dir\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92pxl\" (UniqueName: \"kubernetes.io/projected/46bf8216-a7c3-409f-90fc-e33145053129-kube-api-access-92pxl\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.043939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1585c988-256e-4e50-8aea-b6820c419f11-serviceca\") pod \"node-ca-g4w9k\" (UID: \"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.052352 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:09 crc kubenswrapper[4707]: E0218 05:48:09.052502 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.062979 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=6.06296044 podStartE2EDuration="6.06296044s" podCreationTimestamp="2026-02-18 05:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:09.057829166 +0000 UTC m=+25.705788310" watchObservedRunningTime="2026-02-18 05:48:09.06296044 +0000 UTC m=+25.710919574" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.065428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgdr\" (UniqueName: \"kubernetes.io/projected/568dadf7-7307-4dbd-a384-70470dc247e6-kube-api-access-4fgdr\") pod \"node-resolver-8vb52\" (UID: \"568dadf7-7307-4dbd-a384-70470dc247e6\") " pod="openshift-dns/node-resolver-8vb52" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.117636 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-sbhs6"] Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.118210 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-snscc"] Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.118336 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.118533 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:09 crc kubenswrapper[4707]: E0218 05:48:09.118607 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snscc" podUID="95bdd5db-88ec-41b6-9752-b5646a64f1ae" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.120235 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.121989 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.122052 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.122176 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.122454 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46bf8216-a7c3-409f-90fc-e33145053129-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144437 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6lm\" (UniqueName: \"kubernetes.io/projected/95bdd5db-88ec-41b6-9752-b5646a64f1ae-kube-api-access-tv6lm\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-cnibin\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/185c5347-f458-48a7-bcc8-0b0fcd7b4850-mcd-auth-proxy-config\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc127100-df64-48e7-bed0-620c796dd6b0-cni-binary-copy\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-var-lib-cni-multus\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 
05:48:09.144532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-cnibin\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-multus-socket-dir-parent\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/185c5347-f458-48a7-bcc8-0b0fcd7b4850-proxy-tls\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-os-release\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-run-netns\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144617 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-system-cni-dir\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46bf8216-a7c3-409f-90fc-e33145053129-cni-binary-copy\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-multus-conf-dir\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-run-multus-certs\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144682 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfdkv\" (UniqueName: \"kubernetes.io/projected/185c5347-f458-48a7-bcc8-0b0fcd7b4850-kube-api-access-vfdkv\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-run-k8s-cni-cncf-io\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws4zl\" (UniqueName: \"kubernetes.io/projected/fc127100-df64-48e7-bed0-620c796dd6b0-kube-api-access-ws4zl\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-var-lib-cni-bin\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-var-lib-kubelet\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fc127100-df64-48e7-bed0-620c796dd6b0-multus-daemon-config\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-etc-kubernetes\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144811 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-hostroot\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-multus-cni-dir\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92pxl\" (UniqueName: \"kubernetes.io/projected/46bf8216-a7c3-409f-90fc-e33145053129-kube-api-access-92pxl\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-system-cni-dir\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/185c5347-f458-48a7-bcc8-0b0fcd7b4850-rootfs\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.144930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-os-release\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.145125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-os-release\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.145693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/46bf8216-a7c3-409f-90fc-e33145053129-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.145744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-cnibin\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc127100-df64-48e7-bed0-620c796dd6b0-cni-binary-copy\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-var-lib-cni-multus\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-cnibin\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-multus-socket-dir-parent\") pod \"multus-p9b84\" (UID: 
\"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-os-release\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-run-netns\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146533 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-system-cni-dir\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-run-k8s-cni-cncf-io\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-var-lib-kubelet\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146885 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-hostroot\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-system-cni-dir\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.147006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-multus-cni-dir\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.147003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-etc-kubernetes\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.147011 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-run-multus-certs\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.147055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/fc127100-df64-48e7-bed0-620c796dd6b0-multus-daemon-config\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.146875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-multus-conf-dir\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.147085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc127100-df64-48e7-bed0-620c796dd6b0-host-var-lib-cni-bin\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.147374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46bf8216-a7c3-409f-90fc-e33145053129-cni-binary-copy\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.147696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46bf8216-a7c3-409f-90fc-e33145053129-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.163013 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92pxl\" (UniqueName: \"kubernetes.io/projected/46bf8216-a7c3-409f-90fc-e33145053129-kube-api-access-92pxl\") pod 
\"multus-additional-cni-plugins-jb6vz\" (UID: \"46bf8216-a7c3-409f-90fc-e33145053129\") " pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.164029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws4zl\" (UniqueName: \"kubernetes.io/projected/fc127100-df64-48e7-bed0-620c796dd6b0-kube-api-access-ws4zl\") pod \"multus-p9b84\" (UID: \"fc127100-df64-48e7-bed0-620c796dd6b0\") " pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.222551 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d32e741a7aac6bba87237c583b518af67c767144fc0b5e0d904dbd9e724a027f"} Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.226021 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8vb52" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.237993 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r5qsf"] Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.239035 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.243307 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.244418 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.244624 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.244660 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.245569 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.245588 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6lm\" (UniqueName: \"kubernetes.io/projected/95bdd5db-88ec-41b6-9752-b5646a64f1ae-kube-api-access-tv6lm\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-config\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 
05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/185c5347-f458-48a7-bcc8-0b0fcd7b4850-mcd-auth-proxy-config\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246164 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-var-lib-openvswitch\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-openvswitch\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/185c5347-f458-48a7-bcc8-0b0fcd7b4850-proxy-tls\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-ovn-kubernetes\") pod \"ovnkube-node-r5qsf\" (UID: 
\"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-kubelet\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-systemd\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-script-lib\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-node-log\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-bin\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-env-overrides\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-netns\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-log-socket\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-netd\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfdkv\" (UniqueName: \"kubernetes.io/projected/185c5347-f458-48a7-bcc8-0b0fcd7b4850-kube-api-access-vfdkv\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-slash\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4p6s\" (UniqueName: \"kubernetes.io/projected/2c00624a-9b7d-4593-821c-c76976b1c192-kube-api-access-z4p6s\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/185c5347-f458-48a7-bcc8-0b0fcd7b4850-rootfs\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246648 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-systemd-units\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-ovn\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c00624a-9b7d-4593-821c-c76976b1c192-ovn-node-metrics-cert\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-etc-openvswitch\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.246952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/185c5347-f458-48a7-bcc8-0b0fcd7b4850-mcd-auth-proxy-config\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: E0218 05:48:09.247380 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:09 crc kubenswrapper[4707]: E0218 05:48:09.247428 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs podName:95bdd5db-88ec-41b6-9752-b5646a64f1ae nodeName:}" failed. No retries permitted until 2026-02-18 05:48:09.747413588 +0000 UTC m=+26.395372712 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs") pod "network-metrics-daemon-snscc" (UID: "95bdd5db-88ec-41b6-9752-b5646a64f1ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.247841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/185c5347-f458-48a7-bcc8-0b0fcd7b4850-rootfs\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.249843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/185c5347-f458-48a7-bcc8-0b0fcd7b4850-proxy-tls\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.251478 
4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.263610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6lm\" (UniqueName: \"kubernetes.io/projected/95bdd5db-88ec-41b6-9752-b5646a64f1ae-kube-api-access-tv6lm\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.270831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfdkv\" (UniqueName: \"kubernetes.io/projected/185c5347-f458-48a7-bcc8-0b0fcd7b4850-kube-api-access-vfdkv\") pod \"machine-config-daemon-sbhs6\" (UID: \"185c5347-f458-48a7-bcc8-0b0fcd7b4850\") " pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.328459 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-p9b84" Feb 18 05:48:09 crc kubenswrapper[4707]: W0218 05:48:09.339380 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc127100_df64_48e7_bed0_620c796dd6b0.slice/crio-3e9883eaee20ba00eb75fd649b1f8a5f254a797ea92e309e5a2f7480f5809aca WatchSource:0}: Error finding container 3e9883eaee20ba00eb75fd649b1f8a5f254a797ea92e309e5a2f7480f5809aca: Status 404 returned error can't find the container with id 3e9883eaee20ba00eb75fd649b1f8a5f254a797ea92e309e5a2f7480f5809aca Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4p6s\" (UniqueName: \"kubernetes.io/projected/2c00624a-9b7d-4593-821c-c76976b1c192-kube-api-access-z4p6s\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-slash\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-systemd-units\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-ovn\") 
pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-etc-openvswitch\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c00624a-9b7d-4593-821c-c76976b1c192-ovn-node-metrics-cert\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-systemd-units\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-config\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-slash\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-etc-openvswitch\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-var-lib-openvswitch\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-ovn\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-openvswitch\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-var-lib-openvswitch\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: 
I0218 05:48:09.347598 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-openvswitch\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-ovn-kubernetes\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-ovn-kubernetes\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" 
Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-kubelet\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-systemd\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-node-log\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-script-lib\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-kubelet\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347822 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-bin\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-node-log\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-env-overrides\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-systemd\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-log-socket\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-bin\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-netd\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-netns\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.347993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-netns\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.348035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-log-socket\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.348072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-netd\") pod \"ovnkube-node-r5qsf\" (UID: 
\"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.348357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-config\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.348472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-script-lib\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.348638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-env-overrides\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.350092 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.351296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c00624a-9b7d-4593-821c-c76976b1c192-ovn-node-metrics-cert\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: W0218 05:48:09.366138 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46bf8216_a7c3_409f_90fc_e33145053129.slice/crio-af4248695620eba7b8583eb911e68a32aa3aabbd238dafd23506f0091de20f85 WatchSource:0}: Error finding container af4248695620eba7b8583eb911e68a32aa3aabbd238dafd23506f0091de20f85: Status 404 returned error can't find the container with id af4248695620eba7b8583eb911e68a32aa3aabbd238dafd23506f0091de20f85 Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.368624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4p6s\" (UniqueName: \"kubernetes.io/projected/2c00624a-9b7d-4593-821c-c76976b1c192-kube-api-access-z4p6s\") pod \"ovnkube-node-r5qsf\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.429955 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.569086 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:09 crc kubenswrapper[4707]: W0218 05:48:09.592719 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c00624a_9b7d_4593_821c_c76976b1c192.slice/crio-fbe7c2ce9136d28d74ce63feb2a6d11a4eab0aaee3a6d51f98a8653d1bf6c11f WatchSource:0}: Error finding container fbe7c2ce9136d28d74ce63feb2a6d11a4eab0aaee3a6d51f98a8653d1bf6c11f: Status 404 returned error can't find the container with id fbe7c2ce9136d28d74ce63feb2a6d11a4eab0aaee3a6d51f98a8653d1bf6c11f Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.659228 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b"] Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.659601 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.662025 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.664409 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.716660 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-18 05:43:08 +0000 UTC, rotation deadline is 2026-11-05 00:21:46.807155677 +0000 UTC Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.717089 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6234h33m37.09006946s for next certificate rotation Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.751852 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.751934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg4ls\" (UniqueName: \"kubernetes.io/projected/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-kube-api-access-vg4ls\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.751969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.752001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.752043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs\") pod 
\"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:09 crc kubenswrapper[4707]: E0218 05:48:09.752157 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:09 crc kubenswrapper[4707]: E0218 05:48:09.752215 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs podName:95bdd5db-88ec-41b6-9752-b5646a64f1ae nodeName:}" failed. No retries permitted until 2026-02-18 05:48:10.752197622 +0000 UTC m=+27.400156766 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs") pod "network-metrics-daemon-snscc" (UID: "95bdd5db-88ec-41b6-9752-b5646a64f1ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.853365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.853465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg4ls\" (UniqueName: \"kubernetes.io/projected/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-kube-api-access-vg4ls\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.853507 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.853540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.854361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.854724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.860991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.875553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg4ls\" (UniqueName: \"kubernetes.io/projected/1d8d08ca-bd7f-43b0-8bf1-972ce667fe57-kube-api-access-vg4ls\") pod \"ovnkube-control-plane-749d76644c-xxh8b\" (UID: \"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:09 crc kubenswrapper[4707]: I0218 05:48:09.990201 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.001560 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 03:18:16.067903173 +0000 UTC Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.052987 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.052998 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:10 crc kubenswrapper[4707]: E0218 05:48:10.053140 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:10 crc kubenswrapper[4707]: E0218 05:48:10.053220 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:10 crc kubenswrapper[4707]: E0218 05:48:10.059046 4707 projected.go:288] Couldn't get configMap openshift-image-registry/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 18 05:48:10 crc kubenswrapper[4707]: E0218 05:48:10.059095 4707 projected.go:194] Error preparing data for projected volume kube-api-access-v7qnj for pod openshift-image-registry/node-ca-g4w9k: failed to sync configmap cache: timed out waiting for the condition Feb 18 05:48:10 crc kubenswrapper[4707]: E0218 05:48:10.059194 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1585c988-256e-4e50-8aea-b6820c419f11-kube-api-access-v7qnj podName:1585c988-256e-4e50-8aea-b6820c419f11 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:10.559170576 +0000 UTC m=+27.207129780 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v7qnj" (UniqueName: "kubernetes.io/projected/1585c988-256e-4e50-8aea-b6820c419f11-kube-api-access-v7qnj") pod "node-ca-g4w9k" (UID: "1585c988-256e-4e50-8aea-b6820c419f11") : failed to sync configmap cache: timed out waiting for the condition Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.179294 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.180824 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.180852 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.180860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.180953 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.187915 4707 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.188174 4707 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.189060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.189080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.189087 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.189100 4707 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.189109 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T05:48:10Z","lastTransitionTime":"2026-02-18T05:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.227030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8vb52" event={"ID":"568dadf7-7307-4dbd-a384-70470dc247e6","Type":"ContainerStarted","Data":"c42a1732f71b28cdaa8d3c2a9b56e9abcf159656f4ca807bfd04f08371526579"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.227096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8vb52" event={"ID":"568dadf7-7307-4dbd-a384-70470dc247e6","Type":"ContainerStarted","Data":"cfa6dd1a4145bf59d2682369ecf7a0d87797659222ed3d3bcdbb91b8b503b34d"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.229646 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c00624a-9b7d-4593-821c-c76976b1c192" containerID="85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b" exitCode=0 Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.229717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.229751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" 
event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerStarted","Data":"fbe7c2ce9136d28d74ce63feb2a6d11a4eab0aaee3a6d51f98a8653d1bf6c11f"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.233042 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p9b84" event={"ID":"fc127100-df64-48e7-bed0-620c796dd6b0","Type":"ContainerStarted","Data":"c21be25ee4dcb50109caffd8a9e46273adf7cf635882c8c70a9d4012c55dbb17"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.233088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p9b84" event={"ID":"fc127100-df64-48e7-bed0-620c796dd6b0","Type":"ContainerStarted","Data":"3e9883eaee20ba00eb75fd649b1f8a5f254a797ea92e309e5a2f7480f5809aca"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.237446 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.237468 4707 generic.go:334] "Generic (PLEG): container finished" podID="46bf8216-a7c3-409f-90fc-e33145053129" containerID="5898b376d36f92add961e562691db710baaafa9c3b38b1245600f8fe386e0788" exitCode=0 Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.237562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" event={"ID":"46bf8216-a7c3-409f-90fc-e33145053129","Type":"ContainerDied","Data":"5898b376d36f92add961e562691db710baaafa9c3b38b1245600f8fe386e0788"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.237590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" event={"ID":"46bf8216-a7c3-409f-90fc-e33145053129","Type":"ContainerStarted","Data":"af4248695620eba7b8583eb911e68a32aa3aabbd238dafd23506f0091de20f85"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.242734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" event={"ID":"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57","Type":"ContainerStarted","Data":"77f39b7acbc33e90296993f8a10b3645df6efe05ae6b86f5c1ef8b695213f898"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.242781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" event={"ID":"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57","Type":"ContainerStarted","Data":"3a0f115c178a2c18d0356398c01ca84666ecbad249912df259b8f3612060a151"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.252885 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8vb52" podStartSLOduration=2.252858717 podStartE2EDuration="2.252858717s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:10.244114777 +0000 UTC m=+26.892073901" watchObservedRunningTime="2026-02-18 05:48:10.252858717 +0000 UTC m=+26.900817871" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.253351 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd"] Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.253774 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.255851 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.256300 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.256458 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.256634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"54ae1c88a5fc550eed48cc5b08d7043e5f83c142fb6017e3359dce55f0b1e55b"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.256672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"bb616e6ef6d668e0c27124102abb5c64f761976e7550e81d8eb8d94a07fb5fd4"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.256688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"0a14b149490aadf570898b38aa94482daae8d38ad1e0f157e7a8d9c5008c1bc9"} Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.258079 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.266018 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-p9b84" podStartSLOduration=2.265998123 podStartE2EDuration="2.265998123s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:10.264696829 +0000 UTC m=+26.912655963" watchObservedRunningTime="2026-02-18 05:48:10.265998123 +0000 UTC m=+26.913957257" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.342029 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podStartSLOduration=2.342009445 podStartE2EDuration="2.342009445s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:10.34029533 +0000 UTC m=+26.988254474" watchObservedRunningTime="2026-02-18 05:48:10.342009445 +0000 UTC m=+26.989968569" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.358130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.358218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.358636 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.358706 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.358721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.459519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.459636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.459663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.459774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.459847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.459898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.459934 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.460859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.463269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.476333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ce1c7ec-4dfd-4da7-89e9-c938767b1945-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6fldd\" (UID: \"0ce1c7ec-4dfd-4da7-89e9-c938767b1945\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.560773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7qnj\" (UniqueName: \"kubernetes.io/projected/1585c988-256e-4e50-8aea-b6820c419f11-kube-api-access-v7qnj\") pod \"node-ca-g4w9k\" (UID: \"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" 
Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.564280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7qnj\" (UniqueName: \"kubernetes.io/projected/1585c988-256e-4e50-8aea-b6820c419f11-kube-api-access-v7qnj\") pod \"node-ca-g4w9k\" (UID: \"1585c988-256e-4e50-8aea-b6820c419f11\") " pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.603035 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" Feb 18 05:48:10 crc kubenswrapper[4707]: W0218 05:48:10.633404 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ce1c7ec_4dfd_4da7_89e9_c938767b1945.slice/crio-d90503bcd7dec0e86ca3166a901cf9c2ac2201b798f4336299ab775443b58d4e WatchSource:0}: Error finding container d90503bcd7dec0e86ca3166a901cf9c2ac2201b798f4336299ab775443b58d4e: Status 404 returned error can't find the container with id d90503bcd7dec0e86ca3166a901cf9c2ac2201b798f4336299ab775443b58d4e Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.732191 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-g4w9k" Feb 18 05:48:10 crc kubenswrapper[4707]: W0218 05:48:10.742884 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1585c988_256e_4e50_8aea_b6820c419f11.slice/crio-dacca31290d38f3b1a53ae114472c931ab482b613756b56703b09db87c54b861 WatchSource:0}: Error finding container dacca31290d38f3b1a53ae114472c931ab482b613756b56703b09db87c54b861: Status 404 returned error can't find the container with id dacca31290d38f3b1a53ae114472c931ab482b613756b56703b09db87c54b861 Feb 18 05:48:10 crc kubenswrapper[4707]: I0218 05:48:10.763139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:10 crc kubenswrapper[4707]: E0218 05:48:10.763375 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:10 crc kubenswrapper[4707]: E0218 05:48:10.763576 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs podName:95bdd5db-88ec-41b6-9752-b5646a64f1ae nodeName:}" failed. No retries permitted until 2026-02-18 05:48:12.763560467 +0000 UTC m=+29.411519601 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs") pod "network-metrics-daemon-snscc" (UID: "95bdd5db-88ec-41b6-9752-b5646a64f1ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.002494 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 16:29:46.317932793 +0000 UTC Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.002956 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.014420 4707 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.052350 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.052470 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.052897 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.052976 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snscc" podUID="95bdd5db-88ec-41b6-9752-b5646a64f1ae" Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.263675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerStarted","Data":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.263729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerStarted","Data":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.263745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerStarted","Data":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.263757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerStarted","Data":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.263769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" 
event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerStarted","Data":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.265255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" event={"ID":"0ce1c7ec-4dfd-4da7-89e9-c938767b1945","Type":"ContainerStarted","Data":"da8f2ceae5b766e321690bee1a1571219dbd55d6c0a9c813076661347d44c6a9"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.265361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" event={"ID":"0ce1c7ec-4dfd-4da7-89e9-c938767b1945","Type":"ContainerStarted","Data":"d90503bcd7dec0e86ca3166a901cf9c2ac2201b798f4336299ab775443b58d4e"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.268310 4707 generic.go:334] "Generic (PLEG): container finished" podID="46bf8216-a7c3-409f-90fc-e33145053129" containerID="41204a4cbb7a48579277fd7970753b6fc2ac7a3909b4bfed414bb6985467f874" exitCode=0 Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.268374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" event={"ID":"46bf8216-a7c3-409f-90fc-e33145053129","Type":"ContainerDied","Data":"41204a4cbb7a48579277fd7970753b6fc2ac7a3909b4bfed414bb6985467f874"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.271983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" event={"ID":"1d8d08ca-bd7f-43b0-8bf1-972ce667fe57","Type":"ContainerStarted","Data":"0702610542cc9dc283557f97b4065d78d6dda7affb9bc133faa40d0123cf43fb"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.273850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4w9k" 
event={"ID":"1585c988-256e-4e50-8aea-b6820c419f11","Type":"ContainerStarted","Data":"dacca31290d38f3b1a53ae114472c931ab482b613756b56703b09db87c54b861"} Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.279235 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6fldd" podStartSLOduration=3.279212266 podStartE2EDuration="3.279212266s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:11.278807796 +0000 UTC m=+27.926766930" watchObservedRunningTime="2026-02-18 05:48:11.279212266 +0000 UTC m=+27.927171400" Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.321688 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxh8b" podStartSLOduration=2.321649104 podStartE2EDuration="2.321649104s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:11.320282538 +0000 UTC m=+27.968241672" watchObservedRunningTime="2026-02-18 05:48:11.321649104 +0000 UTC m=+27.969608278" Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.486320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.486708 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-18 05:48:19.48667381 +0000 UTC m=+36.134632964 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.588336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.588392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.588423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:11 crc kubenswrapper[4707]: I0218 05:48:11.588459 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588550 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588570 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588606 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:19.588588964 +0000 UTC m=+36.236548098 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588621 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:19.588613235 +0000 UTC m=+36.236572369 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588688 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588726 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588730 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588820 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588844 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588747 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.588969 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:19.588938113 +0000 UTC m=+36.236897287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:11 crc kubenswrapper[4707]: E0218 05:48:11.589004 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:19.588990114 +0000 UTC m=+36.236949278 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:12 crc kubenswrapper[4707]: I0218 05:48:12.052809 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:12 crc kubenswrapper[4707]: I0218 05:48:12.052962 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:12 crc kubenswrapper[4707]: E0218 05:48:12.053177 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:12 crc kubenswrapper[4707]: E0218 05:48:12.053268 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:12 crc kubenswrapper[4707]: I0218 05:48:12.281624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerStarted","Data":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} Feb 18 05:48:12 crc kubenswrapper[4707]: I0218 05:48:12.283996 4707 generic.go:334] "Generic (PLEG): container finished" podID="46bf8216-a7c3-409f-90fc-e33145053129" containerID="2cdf8476c92dd408e66028d4635c9593a929c6bf3b02e953a497e00f88ed85eb" exitCode=0 Feb 18 05:48:12 crc kubenswrapper[4707]: I0218 05:48:12.284074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" event={"ID":"46bf8216-a7c3-409f-90fc-e33145053129","Type":"ContainerDied","Data":"2cdf8476c92dd408e66028d4635c9593a929c6bf3b02e953a497e00f88ed85eb"} Feb 18 05:48:12 crc 
kubenswrapper[4707]: I0218 05:48:12.286157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4w9k" event={"ID":"1585c988-256e-4e50-8aea-b6820c419f11","Type":"ContainerStarted","Data":"ece78897b0a30e6fdb3da6bbc0a7c56a5539e41dbda765cc3169021e60e75368"} Feb 18 05:48:12 crc kubenswrapper[4707]: I0218 05:48:12.800766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:12 crc kubenswrapper[4707]: E0218 05:48:12.800962 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:12 crc kubenswrapper[4707]: E0218 05:48:12.801046 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs podName:95bdd5db-88ec-41b6-9752-b5646a64f1ae nodeName:}" failed. No retries permitted until 2026-02-18 05:48:16.801022204 +0000 UTC m=+33.448981368 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs") pod "network-metrics-daemon-snscc" (UID: "95bdd5db-88ec-41b6-9752-b5646a64f1ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:13 crc kubenswrapper[4707]: I0218 05:48:13.052622 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:13 crc kubenswrapper[4707]: E0218 05:48:13.054127 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snscc" podUID="95bdd5db-88ec-41b6-9752-b5646a64f1ae" Feb 18 05:48:13 crc kubenswrapper[4707]: I0218 05:48:13.052657 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:13 crc kubenswrapper[4707]: E0218 05:48:13.054685 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:13 crc kubenswrapper[4707]: I0218 05:48:13.291872 4707 generic.go:334] "Generic (PLEG): container finished" podID="46bf8216-a7c3-409f-90fc-e33145053129" containerID="fc672eea44de723e8f3671cae9e28c27d5584b7bd44de69e2e8538b093c9b61f" exitCode=0 Feb 18 05:48:13 crc kubenswrapper[4707]: I0218 05:48:13.292025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" event={"ID":"46bf8216-a7c3-409f-90fc-e33145053129","Type":"ContainerDied","Data":"fc672eea44de723e8f3671cae9e28c27d5584b7bd44de69e2e8538b093c9b61f"} Feb 18 05:48:13 crc kubenswrapper[4707]: I0218 05:48:13.329723 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g4w9k" podStartSLOduration=5.329689657 podStartE2EDuration="5.329689657s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:12.323928139 +0000 UTC m=+28.971887293" watchObservedRunningTime="2026-02-18 05:48:13.329689657 +0000 UTC m=+29.977648821" Feb 18 05:48:13 crc kubenswrapper[4707]: I0218 05:48:13.812327 4707 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 18 05:48:14 crc kubenswrapper[4707]: I0218 05:48:14.052046 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:14 crc kubenswrapper[4707]: I0218 05:48:14.052094 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:14 crc kubenswrapper[4707]: E0218 05:48:14.052971 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:14 crc kubenswrapper[4707]: E0218 05:48:14.053065 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:14 crc kubenswrapper[4707]: I0218 05:48:14.299381 4707 generic.go:334] "Generic (PLEG): container finished" podID="46bf8216-a7c3-409f-90fc-e33145053129" containerID="3464392c852c91698c18ed06d0f2f06b0b29cd33c4993ae02c498d5835cf61d2" exitCode=0 Feb 18 05:48:14 crc kubenswrapper[4707]: I0218 05:48:14.299458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" event={"ID":"46bf8216-a7c3-409f-90fc-e33145053129","Type":"ContainerDied","Data":"3464392c852c91698c18ed06d0f2f06b0b29cd33c4993ae02c498d5835cf61d2"} Feb 18 05:48:14 crc kubenswrapper[4707]: I0218 05:48:14.303704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerStarted","Data":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} Feb 18 05:48:15 crc 
kubenswrapper[4707]: I0218 05:48:15.052833 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:15 crc kubenswrapper[4707]: I0218 05:48:15.052885 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:15 crc kubenswrapper[4707]: E0218 05:48:15.052958 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:15 crc kubenswrapper[4707]: E0218 05:48:15.053071 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snscc" podUID="95bdd5db-88ec-41b6-9752-b5646a64f1ae" Feb 18 05:48:15 crc kubenswrapper[4707]: I0218 05:48:15.312937 4707 generic.go:334] "Generic (PLEG): container finished" podID="46bf8216-a7c3-409f-90fc-e33145053129" containerID="e86e005afb71cf32972ceae7d6040123cf39c6ec6d01e6e447bce51be4363b62" exitCode=0 Feb 18 05:48:15 crc kubenswrapper[4707]: I0218 05:48:15.313000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" event={"ID":"46bf8216-a7c3-409f-90fc-e33145053129","Type":"ContainerDied","Data":"e86e005afb71cf32972ceae7d6040123cf39c6ec6d01e6e447bce51be4363b62"} Feb 18 05:48:16 crc kubenswrapper[4707]: I0218 05:48:16.062827 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:16 crc kubenswrapper[4707]: E0218 05:48:16.063646 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:16 crc kubenswrapper[4707]: I0218 05:48:16.063149 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:16 crc kubenswrapper[4707]: E0218 05:48:16.064392 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:16 crc kubenswrapper[4707]: I0218 05:48:16.324026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerStarted","Data":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} Feb 18 05:48:16 crc kubenswrapper[4707]: I0218 05:48:16.328502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" event={"ID":"46bf8216-a7c3-409f-90fc-e33145053129","Type":"ContainerStarted","Data":"75001b88a2bfcd9f88a19231e422c0214c5b79397cd1963d393c7018ab4b81f3"} Feb 18 05:48:16 crc kubenswrapper[4707]: I0218 05:48:16.355003 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jb6vz" podStartSLOduration=7.35498028 podStartE2EDuration="7.35498028s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:16.35308154 +0000 UTC m=+33.001040674" watchObservedRunningTime="2026-02-18 05:48:16.35498028 +0000 UTC m=+33.002939414" Feb 18 05:48:16 crc kubenswrapper[4707]: I0218 05:48:16.873374 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:16 crc kubenswrapper[4707]: E0218 05:48:16.873681 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:16 crc kubenswrapper[4707]: E0218 
05:48:16.873893 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs podName:95bdd5db-88ec-41b6-9752-b5646a64f1ae nodeName:}" failed. No retries permitted until 2026-02-18 05:48:24.873876535 +0000 UTC m=+41.521835669 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs") pod "network-metrics-daemon-snscc" (UID: "95bdd5db-88ec-41b6-9752-b5646a64f1ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 05:48:17 crc kubenswrapper[4707]: I0218 05:48:17.052088 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:17 crc kubenswrapper[4707]: I0218 05:48:17.052145 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:17 crc kubenswrapper[4707]: E0218 05:48:17.052365 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snscc" podUID="95bdd5db-88ec-41b6-9752-b5646a64f1ae" Feb 18 05:48:17 crc kubenswrapper[4707]: E0218 05:48:17.052488 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:17 crc kubenswrapper[4707]: I0218 05:48:17.333752 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:17 crc kubenswrapper[4707]: I0218 05:48:17.376821 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" podStartSLOduration=8.37678384 podStartE2EDuration="8.37678384s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:17.374930391 +0000 UTC m=+34.022889535" watchObservedRunningTime="2026-02-18 05:48:17.37678384 +0000 UTC m=+34.024742974" Feb 18 05:48:17 crc kubenswrapper[4707]: I0218 05:48:17.423535 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:18 crc kubenswrapper[4707]: I0218 05:48:18.052458 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:18 crc kubenswrapper[4707]: I0218 05:48:18.052557 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:18 crc kubenswrapper[4707]: E0218 05:48:18.052669 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:18 crc kubenswrapper[4707]: E0218 05:48:18.052940 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:18 crc kubenswrapper[4707]: I0218 05:48:18.336857 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:48:18 crc kubenswrapper[4707]: I0218 05:48:18.337417 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:18 crc kubenswrapper[4707]: I0218 05:48:18.361709 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:48:18 crc kubenswrapper[4707]: I0218 05:48:18.578470 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-snscc"] Feb 18 05:48:18 crc kubenswrapper[4707]: I0218 05:48:18.578660 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:18 crc kubenswrapper[4707]: E0218 05:48:18.578768 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-snscc" podUID="95bdd5db-88ec-41b6-9752-b5646a64f1ae" Feb 18 05:48:19 crc kubenswrapper[4707]: I0218 05:48:19.052132 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.052382 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:19 crc kubenswrapper[4707]: I0218 05:48:19.340124 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:48:19 crc kubenswrapper[4707]: I0218 05:48:19.506048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.506285 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:35.50624056 +0000 UTC m=+52.154199734 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:19 crc kubenswrapper[4707]: I0218 05:48:19.631717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:19 crc kubenswrapper[4707]: I0218 05:48:19.631767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:19 crc kubenswrapper[4707]: I0218 05:48:19.631897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:19 crc kubenswrapper[4707]: I0218 05:48:19.631932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632028 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632087 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:35.632070424 +0000 UTC m=+52.280029558 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632479 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632513 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:35.632503665 +0000 UTC m=+52.280462799 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632522 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632555 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632569 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632593 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632610 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632622 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632631 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:35.632610748 +0000 UTC m=+52.280569872 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:19 crc kubenswrapper[4707]: E0218 05:48:19.632701 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 05:48:35.632641829 +0000 UTC m=+52.280601073 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 05:48:20 crc kubenswrapper[4707]: I0218 05:48:20.053259 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:20 crc kubenswrapper[4707]: I0218 05:48:20.053440 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:20 crc kubenswrapper[4707]: I0218 05:48:20.053648 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:20 crc kubenswrapper[4707]: E0218 05:48:20.053448 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 05:48:20 crc kubenswrapper[4707]: E0218 05:48:20.053857 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 05:48:20 crc kubenswrapper[4707]: E0218 05:48:20.053891 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snscc" podUID="95bdd5db-88ec-41b6-9752-b5646a64f1ae" Feb 18 05:48:20 crc kubenswrapper[4707]: I0218 05:48:20.344327 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.052092 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:21 crc kubenswrapper[4707]: E0218 05:48:21.052394 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.808698 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.808951 4707 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.881366 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k2dhh"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.882688 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.887044 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.887281 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.887500 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.887645 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.888290 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.888497 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.888614 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.888767 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.888922 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.891065 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.892455 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-74wxc"] Feb 
18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.892864 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.902347 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.902810 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.902935 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.903070 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.903189 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.903225 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.903356 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.903548 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.903979 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.904420 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-g4hvr"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.904657 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.905122 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.905198 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.905757 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.905960 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.906312 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.907236 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.909313 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.909557 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.910099 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.913162 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.913182 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.915537 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.915618 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.916348 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.916759 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 05:48:21 crc kubenswrapper[4707]: 
I0218 05:48:21.917568 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.917625 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.925317 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.925543 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.925668 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.925951 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.925958 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.926268 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.926470 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.927900 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.928099 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 18 05:48:21 crc kubenswrapper[4707]: 
I0218 05:48:21.928112 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.928142 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.928249 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.928350 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-74wxc"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.928439 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-75w4w"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.929035 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.928452 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.929490 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.929512 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.928619 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.929896 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.929963 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.930097 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.930165 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k2dhh"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.930481 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.930787 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.931593 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.931979 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cbn6t"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.932295 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.932667 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.933176 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.937138 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.937313 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.938269 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.938436 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.938955 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.939242 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.939420 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.939587 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.939848 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.939988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.940143 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.940410 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.940632 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.940959 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.941582 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 18 05:48:21 
crc kubenswrapper[4707]: I0218 05:48:21.941693 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.941857 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.941928 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.942043 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.942102 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.942252 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.942471 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.952860 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.953558 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-g4hvr"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.961505 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.962269 4707 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.965957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-config\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.966003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-client-ca\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.966026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7qps\" (UniqueName: \"kubernetes.io/projected/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-kube-api-access-p7qps\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.966047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-serving-cert\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.976942 4707 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sv9z4"] Feb 18 05:48:21 crc kubenswrapper[4707]: I0218 05:48:21.980542 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.002313 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.002953 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7f42d"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.003481 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.003563 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.004151 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.006943 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.006998 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.007180 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.007287 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.007291 4707 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.007392 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.007511 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.010548 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.015224 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-f9dnq"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.016375 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f9dnq" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.017118 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.024449 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.024696 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.025556 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.029026 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.029901 
4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.031224 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.031876 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.032376 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-m2l7m"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.032410 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.033034 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.035438 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.036249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.036421 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.036532 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.036636 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.037272 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.040279 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.040362 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.040686 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.040826 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.041631 4707 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-console"/"console-serving-cert" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.043092 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.048658 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8258q"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.049220 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.049456 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.051001 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8258q" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.051510 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.051569 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.051981 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r74s4"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.052568 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.053312 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-b87f8"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.059023 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.061164 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.062428 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.062762 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.063475 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.065620 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-config\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-config\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " 
pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/384cd184-04e6-4505-b0f3-a2e367bb6dcd-audit-policies\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067647 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq6qh\" (UniqueName: \"kubernetes.io/projected/b7a4eced-46b2-4002-964d-490b0ad2acd3-kube-api-access-fq6qh\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91397883-fd02-4070-aa06-18da845bbeeb-serving-cert\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-image-import-ca\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.067924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81046e41-7512-4b56-b0d7-ccfa783f973a-config\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bcjs\" (UniqueName: \"kubernetes.io/projected/ec70ffb1-f091-47ed-b947-6af13fd6d34f-kube-api-access-7bcjs\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068082 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/384cd184-04e6-4505-b0f3-a2e367bb6dcd-etcd-client\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-audit\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e80f686a-6a1b-433b-b994-a10eec4758ed-encryption-config\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/384cd184-04e6-4505-b0f3-a2e367bb6dcd-audit-dir\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-client-ca\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/81046e41-7512-4b56-b0d7-ccfa783f973a-machine-approver-tls\") pod 
\"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91397883-fd02-4070-aa06-18da845bbeeb-config\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.068941 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eede7e5-9282-4ee7-b3c0-7b81ccc81503-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jh8cp\" (UID: \"6eede7e5-9282-4ee7-b3c0-7b81ccc81503\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:22 
crc kubenswrapper[4707]: I0218 05:48:22.068970 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkmr\" (UniqueName: \"kubernetes.io/projected/6eede7e5-9282-4ee7-b3c0-7b81ccc81503-kube-api-access-mmkmr\") pod \"openshift-apiserver-operator-796bbdcf4f-jh8cp\" (UID: \"6eede7e5-9282-4ee7-b3c0-7b81ccc81503\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/384cd184-04e6-4505-b0f3-a2e367bb6dcd-serving-cert\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6eede7e5-9282-4ee7-b3c0-7b81ccc81503-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jh8cp\" (UID: \"6eede7e5-9282-4ee7-b3c0-7b81ccc81503\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9k49\" (UniqueName: 
\"kubernetes.io/projected/e80f686a-6a1b-433b-b994-a10eec4758ed-kube-api-access-w9k49\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069129 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-dir\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/384cd184-04e6-4505-b0f3-a2e367bb6dcd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-config\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krm6t\" (UniqueName: \"kubernetes.io/projected/91397883-fd02-4070-aa06-18da845bbeeb-kube-api-access-krm6t\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc 
kubenswrapper[4707]: I0218 05:48:22.069219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e80f686a-6a1b-433b-b994-a10eec4758ed-etcd-client\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b7a4eced-46b2-4002-964d-490b0ad2acd3-images\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-config\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7a4eced-46b2-4002-964d-490b0ad2acd3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6z8b\" (UniqueName: \"kubernetes.io/projected/24beed91-e86e-4dae-a372-ea06be0cefb9-kube-api-access-v6z8b\") pod 
\"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/384cd184-04e6-4505-b0f3-a2e367bb6dcd-encryption-config\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-client-ca\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh85c\" (UniqueName: \"kubernetes.io/projected/81046e41-7512-4b56-b0d7-ccfa783f973a-kube-api-access-dh85c\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069479 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc2dk\" (UniqueName: \"kubernetes.io/projected/384cd184-04e6-4505-b0f3-a2e367bb6dcd-kube-api-access-qc2dk\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a4eced-46b2-4002-964d-490b0ad2acd3-config\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069628 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z454t\" (UniqueName: \"kubernetes.io/projected/feac3e8b-68a4-4bf4-aabc-e76ef9670361-kube-api-access-z454t\") pod 
\"cluster-samples-operator-665b6dd947-c95nl\" (UID: \"feac3e8b-68a4-4bf4-aabc-e76ef9670361\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e80f686a-6a1b-433b-b994-a10eec4758ed-node-pullsecrets\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069675 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7qps\" (UniqueName: \"kubernetes.io/projected/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-kube-api-access-p7qps\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069694 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/384cd184-04e6-4505-b0f3-a2e367bb6dcd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069737 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-etcd-serving-ca\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-serving-cert\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24beed91-e86e-4dae-a372-ea06be0cefb9-serving-cert\") pod 
\"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e80f686a-6a1b-433b-b994-a10eec4758ed-audit-dir\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.069938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91397883-fd02-4070-aa06-18da845bbeeb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.070001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91397883-fd02-4070-aa06-18da845bbeeb-service-ca-bundle\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.070024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feac3e8b-68a4-4bf4-aabc-e76ef9670361-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c95nl\" (UID: \"feac3e8b-68a4-4bf4-aabc-e76ef9670361\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" Feb 18 05:48:22 crc kubenswrapper[4707]: 
I0218 05:48:22.070095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80f686a-6a1b-433b-b994-a10eec4758ed-serving-cert\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.070168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-policies\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.070234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81046e41-7512-4b56-b0d7-ccfa783f973a-auth-proxy-config\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.070359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-client-ca\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.078354 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.084713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-serving-cert\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.091854 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.092044 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.092604 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.092974 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7qhxj"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.093222 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.093475 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.093809 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.094120 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.094228 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.094869 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.095390 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.095743 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.096224 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.097317 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.097678 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.099900 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.100234 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.100684 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.100859 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.101402 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.101756 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.102819 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.103216 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.105073 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.105257 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.105643 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.106458 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6jdbh"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.106923 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.108469 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-879zs"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.109298 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.109822 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.110624 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.112047 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.112707 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.114358 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hgb9s"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.114955 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7g9qp"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.115069 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.115319 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.116081 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.116703 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.117384 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kjq22"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.118302 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.118782 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g8pt7"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.119219 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.120468 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-75w4w"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.122997 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7f42d"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.124404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cbn6t"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.124673 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.127557 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.130516 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.132788 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.134342 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.135772 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.138201 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.145308 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.150883 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.157487 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sv9z4"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.159467 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f9dnq"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.161116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.163306 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.164562 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.166423 4707 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/console-f9d7485db-m2l7m"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.168074 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.170552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.170886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq6qh\" (UniqueName: \"kubernetes.io/projected/b7a4eced-46b2-4002-964d-490b0ad2acd3-kube-api-access-fq6qh\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.170915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91397883-fd02-4070-aa06-18da845bbeeb-serving-cert\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.170936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.170957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-image-import-ca\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.170972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81046e41-7512-4b56-b0d7-ccfa783f973a-config\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.170988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bcjs\" (UniqueName: \"kubernetes.io/projected/ec70ffb1-f091-47ed-b947-6af13fd6d34f-kube-api-access-7bcjs\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/384cd184-04e6-4505-b0f3-a2e367bb6dcd-etcd-client\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171021 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-audit\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/e80f686a-6a1b-433b-b994-a10eec4758ed-encryption-config\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/384cd184-04e6-4505-b0f3-a2e367bb6dcd-audit-dir\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-client-ca\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/81046e41-7512-4b56-b0d7-ccfa783f973a-machine-approver-tls\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171128 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91397883-fd02-4070-aa06-18da845bbeeb-config\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eede7e5-9282-4ee7-b3c0-7b81ccc81503-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jh8cp\" (UID: \"6eede7e5-9282-4ee7-b3c0-7b81ccc81503\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkmr\" (UniqueName: \"kubernetes.io/projected/6eede7e5-9282-4ee7-b3c0-7b81ccc81503-kube-api-access-mmkmr\") pod \"openshift-apiserver-operator-796bbdcf4f-jh8cp\" (UID: \"6eede7e5-9282-4ee7-b3c0-7b81ccc81503\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171218 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/384cd184-04e6-4505-b0f3-a2e367bb6dcd-serving-cert\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: 
\"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6eede7e5-9282-4ee7-b3c0-7b81ccc81503-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jh8cp\" (UID: \"6eede7e5-9282-4ee7-b3c0-7b81ccc81503\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9k49\" (UniqueName: \"kubernetes.io/projected/e80f686a-6a1b-433b-b994-a10eec4758ed-kube-api-access-w9k49\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-dir\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/384cd184-04e6-4505-b0f3-a2e367bb6dcd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-config\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171343 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krm6t\" (UniqueName: \"kubernetes.io/projected/91397883-fd02-4070-aa06-18da845bbeeb-kube-api-access-krm6t\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e80f686a-6a1b-433b-b994-a10eec4758ed-etcd-client\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b7a4eced-46b2-4002-964d-490b0ad2acd3-images\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171397 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7a4eced-46b2-4002-964d-490b0ad2acd3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6z8b\" (UniqueName: \"kubernetes.io/projected/24beed91-e86e-4dae-a372-ea06be0cefb9-kube-api-access-v6z8b\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/384cd184-04e6-4505-b0f3-a2e367bb6dcd-encryption-config\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh85c\" (UniqueName: \"kubernetes.io/projected/81046e41-7512-4b56-b0d7-ccfa783f973a-kube-api-access-dh85c\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc2dk\" (UniqueName: \"kubernetes.io/projected/384cd184-04e6-4505-b0f3-a2e367bb6dcd-kube-api-access-qc2dk\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a4eced-46b2-4002-964d-490b0ad2acd3-config\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z454t\" (UniqueName: 
\"kubernetes.io/projected/feac3e8b-68a4-4bf4-aabc-e76ef9670361-kube-api-access-z454t\") pod \"cluster-samples-operator-665b6dd947-c95nl\" (UID: \"feac3e8b-68a4-4bf4-aabc-e76ef9670361\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e80f686a-6a1b-433b-b994-a10eec4758ed-node-pullsecrets\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/384cd184-04e6-4505-b0f3-a2e367bb6dcd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 
05:48:22.171644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-etcd-serving-ca\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24beed91-e86e-4dae-a372-ea06be0cefb9-serving-cert\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e80f686a-6a1b-433b-b994-a10eec4758ed-audit-dir\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171701 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/91397883-fd02-4070-aa06-18da845bbeeb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91397883-fd02-4070-aa06-18da845bbeeb-service-ca-bundle\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feac3e8b-68a4-4bf4-aabc-e76ef9670361-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c95nl\" (UID: \"feac3e8b-68a4-4bf4-aabc-e76ef9670361\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80f686a-6a1b-433b-b994-a10eec4758ed-serving-cert\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171819 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-policies\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: 
I0218 05:48:22.171857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81046e41-7512-4b56-b0d7-ccfa783f973a-auth-proxy-config\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-config\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/384cd184-04e6-4505-b0f3-a2e367bb6dcd-audit-policies\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: 
\"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.171956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.172975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b7a4eced-46b2-4002-964d-490b0ad2acd3-images\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.173007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.173341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.173407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.173444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e80f686a-6a1b-433b-b994-a10eec4758ed-node-pullsecrets\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.173717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91397883-fd02-4070-aa06-18da845bbeeb-config\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.174058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-policies\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.174335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a4eced-46b2-4002-964d-490b0ad2acd3-config\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.174673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.174725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e80f686a-6a1b-433b-b994-a10eec4758ed-audit-dir\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.175272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eede7e5-9282-4ee7-b3c0-7b81ccc81503-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jh8cp\" (UID: \"6eede7e5-9282-4ee7-b3c0-7b81ccc81503\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.175782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-image-import-ca\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.175912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-config\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.176217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/81046e41-7512-4b56-b0d7-ccfa783f973a-config\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.176521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/384cd184-04e6-4505-b0f3-a2e367bb6dcd-audit-policies\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.176893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91397883-fd02-4070-aa06-18da845bbeeb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91397883-fd02-4070-aa06-18da845bbeeb-serving-cert\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91397883-fd02-4070-aa06-18da845bbeeb-service-ca-bundle\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177354 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-etcd-serving-ca\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/384cd184-04e6-4505-b0f3-a2e367bb6dcd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/384cd184-04e6-4505-b0f3-a2e367bb6dcd-audit-dir\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/384cd184-04e6-4505-b0f3-a2e367bb6dcd-serving-cert\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177565 4707 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7qhxj"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177594 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gnrhq"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80f686a-6a1b-433b-b994-a10eec4758ed-serving-cert\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.177936 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81046e41-7512-4b56-b0d7-ccfa783f973a-auth-proxy-config\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.178003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e80f686a-6a1b-433b-b994-a10eec4758ed-audit\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.178317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-client-ca\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.178449 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gnrhq" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.178483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-dir\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.178669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.179030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-config\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.179091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.179210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feac3e8b-68a4-4bf4-aabc-e76ef9670361-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-c95nl\" (UID: \"feac3e8b-68a4-4bf4-aabc-e76ef9670361\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.179239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/384cd184-04e6-4505-b0f3-a2e367bb6dcd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.179367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.179655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.180322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.180673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24beed91-e86e-4dae-a372-ea06be0cefb9-serving-cert\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.180769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.181271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/384cd184-04e6-4505-b0f3-a2e367bb6dcd-etcd-client\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.181648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.181827 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6jdbh"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.181836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e80f686a-6a1b-433b-b994-a10eec4758ed-encryption-config\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.181849 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6eede7e5-9282-4ee7-b3c0-7b81ccc81503-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jh8cp\" (UID: \"6eede7e5-9282-4ee7-b3c0-7b81ccc81503\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.182295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.182856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.183091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/81046e41-7512-4b56-b0d7-ccfa783f973a-machine-approver-tls\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.183530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7a4eced-46b2-4002-964d-490b0ad2acd3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: 
\"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.183718 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.184130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e80f686a-6a1b-433b-b994-a10eec4758ed-etcd-client\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.184865 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.184891 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.186074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/384cd184-04e6-4505-b0f3-a2e367bb6dcd-encryption-config\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.186155 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.188326 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gnrhq"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.189423 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-8258q"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.190552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.191723 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.192841 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.193887 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r74s4"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.195016 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kjq22"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.196096 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g8pt7"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.197176 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hgb9s"] Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.205757 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.225632 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.245874 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.265229 4707 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.292153 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.305224 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.325668 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.344614 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.365627 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.385258 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.405697 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.424529 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.446351 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.465428 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.485911 4707 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.504705 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.525253 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.545077 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.565637 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.585078 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.606477 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.626116 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.682848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7qps\" (UniqueName: \"kubernetes.io/projected/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-kube-api-access-p7qps\") pod \"route-controller-manager-6576b87f9c-nbvhn\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.705158 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.726258 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.745602 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.765061 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.786105 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.805505 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.825284 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.844919 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.864364 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.881184 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.884409 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.905607 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.926879 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.945699 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.965383 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 18 05:48:22 crc kubenswrapper[4707]: I0218 05:48:22.986379 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.006139 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.025788 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.045941 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.052671 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.065648 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.085640 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.103471 4707 request.go:700] Waited for 1.00739896s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.104885 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.107761 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn"] Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.125131 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.144787 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.165433 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.185137 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.205719 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.225132 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.245881 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.264824 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.285181 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.305238 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.326224 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.345387 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.355503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" 
event={"ID":"fd8702f2-1cdf-48fb-ad08-e6f533cc8404","Type":"ContainerStarted","Data":"a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228"} Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.355594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" event={"ID":"fd8702f2-1cdf-48fb-ad08-e6f533cc8404","Type":"ContainerStarted","Data":"f3f9073a48a59b19eacc1d029dda73dd1156128aeb7cff719ca2918f76b60375"} Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.355888 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.358559 4707 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-nbvhn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.358603 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" podUID="fd8702f2-1cdf-48fb-ad08-e6f533cc8404" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.366268 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.385379 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.406580 4707 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.425183 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.446119 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.466165 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.485228 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.505008 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.526606 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.546252 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.566316 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.585889 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.605379 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.625937 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.645028 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.666133 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.690147 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.705902 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.726136 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.746426 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.765024 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.785159 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.804979 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.824871 4707 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.844457 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.869446 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.885996 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.905913 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.925715 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.945009 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.966461 4707 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 18 05:48:23 crc kubenswrapper[4707]: I0218 05:48:23.985109 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.006199 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.025478 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.046352 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.066339 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.093970 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.103892 4707 request.go:700] Waited for 1.930268566s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.132990 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq6qh\" (UniqueName: \"kubernetes.io/projected/b7a4eced-46b2-4002-964d-490b0ad2acd3-kube-api-access-fq6qh\") pod \"machine-api-operator-5694c8668f-g4hvr\" (UID: \"b7a4eced-46b2-4002-964d-490b0ad2acd3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.143824 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.153862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z454t\" (UniqueName: \"kubernetes.io/projected/feac3e8b-68a4-4bf4-aabc-e76ef9670361-kube-api-access-z454t\") pod \"cluster-samples-operator-665b6dd947-c95nl\" (UID: \"feac3e8b-68a4-4bf4-aabc-e76ef9670361\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.170851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9k49\" (UniqueName: \"kubernetes.io/projected/e80f686a-6a1b-433b-b994-a10eec4758ed-kube-api-access-w9k49\") pod \"apiserver-76f77b778f-k2dhh\" (UID: \"e80f686a-6a1b-433b-b994-a10eec4758ed\") " pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.190600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkmr\" (UniqueName: \"kubernetes.io/projected/6eede7e5-9282-4ee7-b3c0-7b81ccc81503-kube-api-access-mmkmr\") pod \"openshift-apiserver-operator-796bbdcf4f-jh8cp\" (UID: \"6eede7e5-9282-4ee7-b3c0-7b81ccc81503\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.194017 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.210870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bcjs\" (UniqueName: \"kubernetes.io/projected/ec70ffb1-f091-47ed-b947-6af13fd6d34f-kube-api-access-7bcjs\") pod \"oauth-openshift-558db77b4-cbn6t\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.211265 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.219973 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.227440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6z8b\" (UniqueName: \"kubernetes.io/projected/24beed91-e86e-4dae-a372-ea06be0cefb9-kube-api-access-v6z8b\") pod \"controller-manager-879f6c89f-74wxc\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.277428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh85c\" (UniqueName: \"kubernetes.io/projected/81046e41-7512-4b56-b0d7-ccfa783f973a-kube-api-access-dh85c\") pod \"machine-approver-56656f9798-qd7rc\" (UID: \"81046e41-7512-4b56-b0d7-ccfa783f973a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.280993 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 18 
05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.286270 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.299710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krm6t\" (UniqueName: \"kubernetes.io/projected/91397883-fd02-4070-aa06-18da845bbeeb-kube-api-access-krm6t\") pod \"authentication-operator-69f744f599-75w4w\" (UID: \"91397883-fd02-4070-aa06-18da845bbeeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.306170 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.325115 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.338261 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.357265 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.360047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc2dk\" (UniqueName: \"kubernetes.io/projected/384cd184-04e6-4505-b0f3-a2e367bb6dcd-kube-api-access-qc2dk\") pod \"apiserver-7bbb656c7d-lhz6t\" (UID: \"384cd184-04e6-4505-b0f3-a2e367bb6dcd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.368755 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405667 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-metrics-tls\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-bound-sa-token\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405735 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-default-certificate\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " 
pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbfcc\" (UniqueName: \"kubernetes.io/projected/a8657192-49a2-4c45-bc94-bbc3e2e608af-kube-api-access-gbfcc\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-service-ca\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405821 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnh2m\" (UniqueName: \"kubernetes.io/projected/268d3c04-eb32-411a-8032-caa99ca62ade-kube-api-access-hnh2m\") pod \"dns-default-7f42d\" (UID: \"268d3c04-eb32-411a-8032-caa99ca62ade\") " pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-certificates\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-config\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/de733605-5332-483f-b086-f97251d14bab-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-config\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.405979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de733605-5332-483f-b086-f97251d14bab-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glpjm\" (UniqueName: \"kubernetes.io/projected/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-kube-api-access-glpjm\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407406 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/76d59c18-7131-4e63-bb20-3ae0e2ac8edb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wd4tk\" (UID: \"76d59c18-7131-4e63-bb20-3ae0e2ac8edb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-stats-auth\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407580 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76d59c18-7131-4e63-bb20-3ae0e2ac8edb-serving-cert\") pod \"openshift-config-operator-7777fb866f-wd4tk\" (UID: \"76d59c18-7131-4e63-bb20-3ae0e2ac8edb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-oauth-serving-cert\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c25xt\" (UniqueName: \"kubernetes.io/projected/055d0ae1-3458-4a39-85ae-6880ec2bae14-kube-api-access-c25xt\") pod \"downloads-7954f5f757-f9dnq\" (UID: \"055d0ae1-3458-4a39-85ae-6880ec2bae14\") " pod="openshift-console/downloads-7954f5f757-f9dnq" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-oauth-config\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " 
pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de733605-5332-483f-b086-f97251d14bab-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-trusted-ca\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-service-ca-bundle\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsk6w\" (UniqueName: \"kubernetes.io/projected/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-kube-api-access-rsk6w\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-trusted-ca\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407901 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-785km\" (UniqueName: \"kubernetes.io/projected/de733605-5332-483f-b086-f97251d14bab-kube-api-access-785km\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4c25682-b294-4940-be7e-2f5eb7e7366d-metrics-tls\") pod \"dns-operator-744455d44c-8258q\" (UID: \"c4c25682-b294-4940-be7e-2f5eb7e7366d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8258q" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-trusted-ca-bundle\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.407997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/268d3c04-eb32-411a-8032-caa99ca62ade-config-volume\") pod \"dns-default-7f42d\" (UID: \"268d3c04-eb32-411a-8032-caa99ca62ade\") " pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.408034 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mln25\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-kube-api-access-mln25\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.408053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-serving-cert\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.408105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19a7180b-9bd3-4a29-8c77-e385307350cc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-68b78\" (UID: \"19a7180b-9bd3-4a29-8c77-e385307350cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:24 crc kubenswrapper[4707]: E0218 05:48:24.409570 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:24.909555202 +0000 UTC m=+41.557514336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.410134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-serving-cert\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.411386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19a7180b-9bd3-4a29-8c77-e385307350cc-config\") pod \"kube-controller-manager-operator-78b949d7b-68b78\" (UID: \"19a7180b-9bd3-4a29-8c77-e385307350cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.411460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvcs5\" (UniqueName: \"kubernetes.io/projected/c4c25682-b294-4940-be7e-2f5eb7e7366d-kube-api-access-lvcs5\") pod \"dns-operator-744455d44c-8258q\" (UID: \"c4c25682-b294-4940-be7e-2f5eb7e7366d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8258q" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.411484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9lcx\" (UniqueName: 
\"kubernetes.io/projected/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-kube-api-access-l9lcx\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.411503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-metrics-certs\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.411928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-tls\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.415076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19a7180b-9bd3-4a29-8c77-e385307350cc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-68b78\" (UID: \"19a7180b-9bd3-4a29-8c77-e385307350cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.415183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l258m\" (UniqueName: \"kubernetes.io/projected/76d59c18-7131-4e63-bb20-3ae0e2ac8edb-kube-api-access-l258m\") pod \"openshift-config-operator-7777fb866f-wd4tk\" (UID: \"76d59c18-7131-4e63-bb20-3ae0e2ac8edb\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.415210 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.415234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-trusted-ca\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.415305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/268d3c04-eb32-411a-8032-caa99ca62ade-metrics-tls\") pod \"dns-default-7f42d\" (UID: \"268d3c04-eb32-411a-8032-caa99ca62ade\") " pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.430989 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-g4hvr"] Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.431200 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:24 crc kubenswrapper[4707]: W0218 05:48:24.444689 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a4eced_46b2_4002_964d_490b0ad2acd3.slice/crio-b5749bb185c557026c303d1fd4d5124d8a734887a0c450e3e3dabe5dcaf4c2b5 WatchSource:0}: Error finding container b5749bb185c557026c303d1fd4d5124d8a734887a0c450e3e3dabe5dcaf4c2b5: Status 404 returned error can't find the container with id b5749bb185c557026c303d1fd4d5124d8a734887a0c450e3e3dabe5dcaf4c2b5 Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.453285 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.506309 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.515982 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48f33e8e-2190-4402-9a3d-b8f7f3324da4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7vfl\" (UID: \"48f33e8e-2190-4402-9a3d-b8f7f3324da4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516384 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bb2eb67f-2587-4991-93cd-aeb2eed647e3-signing-cabundle\") pod \"service-ca-9c57cc56f-6jdbh\" (UID: \"bb2eb67f-2587-4991-93cd-aeb2eed647e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-trusted-ca\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bb2eb67f-2587-4991-93cd-aeb2eed647e3-signing-key\") pod \"service-ca-9c57cc56f-6jdbh\" (UID: \"bb2eb67f-2587-4991-93cd-aeb2eed647e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bc0f806-810b-48df-8795-fb4962e906c1-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn6td\" (UniqueName: \"kubernetes.io/projected/e09b1e8f-752e-42dc-a638-cc7ac7179f83-kube-api-access-qn6td\") pod \"control-plane-machine-set-operator-78cbb6b69f-tnwns\" (UID: \"e09b1e8f-752e-42dc-a638-cc7ac7179f83\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15219006-aa83-4fc4-afed-79a5e7f18eda-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fp5dm\" (UID: \"15219006-aa83-4fc4-afed-79a5e7f18eda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048d19f3-df93-4565-87ea-dd7ce3d9e888-config\") pod \"service-ca-operator-777779d784-vx8tz\" (UID: \"048d19f3-df93-4565-87ea-dd7ce3d9e888\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-metrics-tls\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-bound-sa-token\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516556 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5566031f-2b93-4c7f-baff-f5929fc609ed-certs\") pod \"machine-config-server-7g9qp\" (UID: \"5566031f-2b93-4c7f-baff-f5929fc609ed\") " pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1888d8d-c40a-4ee1-a697-6f5e97bf657d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l2jp2\" (UID: \"c1888d8d-c40a-4ee1-a697-6f5e97bf657d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbqkx\" (UniqueName: \"kubernetes.io/projected/6921a979-b8b5-4664-af9a-c56983db0020-kube-api-access-lbqkx\") pod \"ingress-canary-gnrhq\" (UID: \"6921a979-b8b5-4664-af9a-c56983db0020\") " pod="openshift-ingress-canary/ingress-canary-gnrhq" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-certificates\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00e150c8-f0e4-4038-8213-cc395d64d48b-config\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: 
I0218 05:48:24.516644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/de733605-5332-483f-b086-f97251d14bab-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de733605-5332-483f-b086-f97251d14bab-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-proxy-tls\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/76d59c18-7131-4e63-bb20-3ae0e2ac8edb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wd4tk\" (UID: \"76d59c18-7131-4e63-bb20-3ae0e2ac8edb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: E0218 05:48:24.517000 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:25.016975381 +0000 UTC m=+41.664934515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.518350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-trusted-ca\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.518633 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-certificates\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.519332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de733605-5332-483f-b086-f97251d14bab-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.519629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/76d59c18-7131-4e63-bb20-3ae0e2ac8edb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wd4tk\" (UID: \"76d59c18-7131-4e63-bb20-3ae0e2ac8edb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.516745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4jk\" (UniqueName: \"kubernetes.io/projected/00e150c8-f0e4-4038-8213-cc395d64d48b-kube-api-access-xg4jk\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdb9f9f8-7162-479f-a789-dd3e61578ec4-config-volume\") pod \"collect-profiles-29523225-rdxbl\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-oauth-serving-cert\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" 
Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/00e150c8-f0e4-4038-8213-cc395d64d48b-etcd-client\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/00e150c8-f0e4-4038-8213-cc395d64d48b-etcd-ca\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78651c6-6845-4f2a-a2dc-16590108a302-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tp7zn\" (UID: \"e78651c6-6845-4f2a-a2dc-16590108a302\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de733605-5332-483f-b086-f97251d14bab-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-trusted-ca\") pod 
\"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e09b1e8f-752e-42dc-a638-cc7ac7179f83-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tnwns\" (UID: \"e09b1e8f-752e-42dc-a638-cc7ac7179f83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsk6w\" (UniqueName: \"kubernetes.io/projected/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-kube-api-access-rsk6w\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz924\" (UniqueName: \"kubernetes.io/projected/34a7237c-6b9e-4036-8649-ab89d0ec1893-kube-api-access-qz924\") pod \"catalog-operator-68c6474976-4sp72\" (UID: \"34a7237c-6b9e-4036-8649-ab89d0ec1893\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-trusted-ca\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 
05:48:24.521632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdb9f9f8-7162-479f-a789-dd3e61578ec4-secret-volume\") pod \"collect-profiles-29523225-rdxbl\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mln25\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-kube-api-access-mln25\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-serving-cert\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ccb59e4a-55d3-436a-94c1-e926834470fb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hgb9s\" (UID: \"ccb59e4a-55d3-436a-94c1-e926834470fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgc4n\" (UniqueName: \"kubernetes.io/projected/15219006-aa83-4fc4-afed-79a5e7f18eda-kube-api-access-fgc4n\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-fp5dm\" (UID: \"15219006-aa83-4fc4-afed-79a5e7f18eda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-images\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6bc0f806-810b-48df-8795-fb4962e906c1-ready\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvcs5\" (UniqueName: \"kubernetes.io/projected/c4c25682-b294-4940-be7e-2f5eb7e7366d-kube-api-access-lvcs5\") pod \"dns-operator-744455d44c-8258q\" (UID: \"c4c25682-b294-4940-be7e-2f5eb7e7366d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8258q" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521810 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9lcx\" (UniqueName: \"kubernetes.io/projected/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-kube-api-access-l9lcx\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521834 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfdj8\" (UniqueName: \"kubernetes.io/projected/52fa1b0d-b1f0-4964-ba77-417233b07f60-kube-api-access-nfdj8\") pod \"package-server-manager-789f6589d5-pc79b\" (UID: \"52fa1b0d-b1f0-4964-ba77-417233b07f60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2-srv-cert\") pod \"olm-operator-6b444d44fb-pvdd9\" (UID: \"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddb09044-1d2b-435d-a424-d22fbac84ca8-proxy-tls\") pod \"machine-config-controller-84d6567774-lhw9r\" (UID: \"ddb09044-1d2b-435d-a424-d22fbac84ca8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/34a7237c-6b9e-4036-8649-ab89d0ec1893-srv-cert\") pod \"catalog-operator-68c6474976-4sp72\" (UID: \"34a7237c-6b9e-4036-8649-ab89d0ec1893\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l258m\" (UniqueName: \"kubernetes.io/projected/76d59c18-7131-4e63-bb20-3ae0e2ac8edb-kube-api-access-l258m\") pod \"openshift-config-operator-7777fb866f-wd4tk\" 
(UID: \"76d59c18-7131-4e63-bb20-3ae0e2ac8edb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4bm6\" (UniqueName: \"kubernetes.io/projected/fdb9f9f8-7162-479f-a789-dd3e61578ec4-kube-api-access-h4bm6\") pod \"collect-profiles-29523225-rdxbl\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.521976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/048d19f3-df93-4565-87ea-dd7ce3d9e888-serving-cert\") pod \"service-ca-operator-777779d784-vx8tz\" (UID: \"048d19f3-df93-4565-87ea-dd7ce3d9e888\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.522189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/52fa1b0d-b1f0-4964-ba77-417233b07f60-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-pc79b\" (UID: \"52fa1b0d-b1f0-4964-ba77-417233b07f60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523007 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/496fadfa-36c2-47ff-a5b1-a485de9df869-webhook-cert\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsdgw\" (UniqueName: \"kubernetes.io/projected/496fadfa-36c2-47ff-a5b1-a485de9df869-kube-api-access-qsdgw\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pvdd9\" (UID: \"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/268d3c04-eb32-411a-8032-caa99ca62ade-metrics-tls\") pod \"dns-default-7f42d\" (UID: \"268d3c04-eb32-411a-8032-caa99ca62ade\") " pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kg5v\" (UniqueName: \"kubernetes.io/projected/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-kube-api-access-9kg5v\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: 
\"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78651c6-6845-4f2a-a2dc-16590108a302-config\") pod \"kube-apiserver-operator-766d6c64bb-tp7zn\" (UID: \"e78651c6-6845-4f2a-a2dc-16590108a302\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-g8pt7\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-socket-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-mountpoint-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-default-certificate\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbfcc\" (UniqueName: \"kubernetes.io/projected/a8657192-49a2-4c45-bc94-bbc3e2e608af-kube-api-access-gbfcc\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlgjd\" (UniqueName: \"kubernetes.io/projected/bb2eb67f-2587-4991-93cd-aeb2eed647e3-kube-api-access-dlgjd\") pod \"service-ca-9c57cc56f-6jdbh\" (UID: \"bb2eb67f-2587-4991-93cd-aeb2eed647e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-service-ca\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnh2m\" (UniqueName: \"kubernetes.io/projected/268d3c04-eb32-411a-8032-caa99ca62ade-kube-api-access-hnh2m\") pod \"dns-default-7f42d\" (UID: \"268d3c04-eb32-411a-8032-caa99ca62ade\") " pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523649 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/496fadfa-36c2-47ff-a5b1-a485de9df869-tmpfs\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-config\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbh2\" (UniqueName: \"kubernetes.io/projected/5566031f-2b93-4c7f-baff-f5929fc609ed-kube-api-access-fsbh2\") pod \"machine-config-server-7g9qp\" (UID: \"5566031f-2b93-4c7f-baff-f5929fc609ed\") " pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.523749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-config\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 
05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.524040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-oauth-serving-cert\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.524050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-config\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glpjm\" (UniqueName: \"kubernetes.io/projected/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-kube-api-access-glpjm\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-stats-auth\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 
05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-registration-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-plugins-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnqvs\" (UniqueName: \"kubernetes.io/projected/8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2-kube-api-access-qnqvs\") pod \"olm-operator-6b444d44fb-pvdd9\" (UID: \"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f33e8e-2190-4402-9a3d-b8f7f3324da4-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-m7vfl\" (UID: \"48f33e8e-2190-4402-9a3d-b8f7f3324da4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/496fadfa-36c2-47ff-a5b1-a485de9df869-apiservice-cert\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76d59c18-7131-4e63-bb20-3ae0e2ac8edb-serving-cert\") pod \"openshift-config-operator-7777fb866f-wd4tk\" (UID: \"76d59c18-7131-4e63-bb20-3ae0e2ac8edb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5566031f-2b93-4c7f-baff-f5929fc609ed-node-bootstrap-token\") pod \"machine-config-server-7g9qp\" (UID: \"5566031f-2b93-4c7f-baff-f5929fc609ed\") " pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c25xt\" (UniqueName: \"kubernetes.io/projected/055d0ae1-3458-4a39-85ae-6880ec2bae14-kube-api-access-c25xt\") pod \"downloads-7954f5f757-f9dnq\" (UID: \"055d0ae1-3458-4a39-85ae-6880ec2bae14\") " pod="openshift-console/downloads-7954f5f757-f9dnq" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525763 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psdmw\" (UniqueName: \"kubernetes.io/projected/abbe64a9-dc0c-4b1b-931e-dddd926a91c8-kube-api-access-psdmw\") pod \"migrator-59844c95c7-5sgn9\" (UID: \"abbe64a9-dc0c-4b1b-931e-dddd926a91c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.525783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48f33e8e-2190-4402-9a3d-b8f7f3324da4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7vfl\" (UID: \"48f33e8e-2190-4402-9a3d-b8f7f3324da4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.526759 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-config\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.527223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-metrics-tls\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.527425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-trusted-ca\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: E0218 05:48:24.527680 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:25.026017549 +0000 UTC m=+41.673976683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.527758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-g8pt7\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.527933 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-trusted-ca\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.528588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-default-certificate\") pod 
\"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.529120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-oauth-config\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.529176 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1888d8d-c40a-4ee1-a697-6f5e97bf657d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l2jp2\" (UID: \"c1888d8d-c40a-4ee1-a697-6f5e97bf657d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.529285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx85l\" (UniqueName: \"kubernetes.io/projected/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-kube-api-access-mx85l\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.529354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ddb09044-1d2b-435d-a424-d22fbac84ca8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lhw9r\" (UID: \"ddb09044-1d2b-435d-a424-d22fbac84ca8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.529407 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-service-ca-bundle\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.529446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd648\" (UniqueName: \"kubernetes.io/projected/c1888d8d-c40a-4ee1-a697-6f5e97bf657d-kube-api-access-kd648\") pod \"openshift-controller-manager-operator-756b6f6bc6-l2jp2\" (UID: \"c1888d8d-c40a-4ee1-a697-6f5e97bf657d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.529832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-785km\" (UniqueName: \"kubernetes.io/projected/de733605-5332-483f-b086-f97251d14bab-kube-api-access-785km\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.529941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4c25682-b294-4940-be7e-2f5eb7e7366d-metrics-tls\") pod \"dns-operator-744455d44c-8258q\" (UID: \"c4c25682-b294-4940-be7e-2f5eb7e7366d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8258q" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.529977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6bc0f806-810b-48df-8795-fb4962e906c1-cni-sysctl-allowlist\") pod 
\"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530059 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-service-ca-bundle\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530190 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-csi-data-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/34a7237c-6b9e-4036-8649-ab89d0ec1893-profile-collector-cert\") pod \"catalog-operator-68c6474976-4sp72\" (UID: \"34a7237c-6b9e-4036-8649-ab89d0ec1893\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-trusted-ca-bundle\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/268d3c04-eb32-411a-8032-caa99ca62ade-config-volume\") pod \"dns-default-7f42d\" (UID: \"268d3c04-eb32-411a-8032-caa99ca62ade\") " pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15219006-aa83-4fc4-afed-79a5e7f18eda-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fp5dm\" (UID: \"15219006-aa83-4fc4-afed-79a5e7f18eda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7bb7\" (UniqueName: \"kubernetes.io/projected/ddb09044-1d2b-435d-a424-d22fbac84ca8-kube-api-access-x7bb7\") pod \"machine-config-controller-84d6567774-lhw9r\" (UID: \"ddb09044-1d2b-435d-a424-d22fbac84ca8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19a7180b-9bd3-4a29-8c77-e385307350cc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-68b78\" (UID: \"19a7180b-9bd3-4a29-8c77-e385307350cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q589n\" (UniqueName: \"kubernetes.io/projected/ccb59e4a-55d3-436a-94c1-e926834470fb-kube-api-access-q589n\") pod \"multus-admission-controller-857f4d67dd-hgb9s\" (UID: 
\"ccb59e4a-55d3-436a-94c1-e926834470fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-serving-cert\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e78651c6-6845-4f2a-a2dc-16590108a302-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tp7zn\" (UID: \"e78651c6-6845-4f2a-a2dc-16590108a302\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00e150c8-f0e4-4038-8213-cc395d64d48b-serving-cert\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-ztmsn\" (UniqueName: \"kubernetes.io/projected/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-kube-api-access-ztmsn\") pod \"marketplace-operator-79b997595-g8pt7\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19a7180b-9bd3-4a29-8c77-e385307350cc-config\") pod \"kube-controller-manager-operator-78b949d7b-68b78\" (UID: \"19a7180b-9bd3-4a29-8c77-e385307350cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/00e150c8-f0e4-4038-8213-cc395d64d48b-etcd-service-ca\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6921a979-b8b5-4664-af9a-c56983db0020-cert\") pod \"ingress-canary-gnrhq\" (UID: \"6921a979-b8b5-4664-af9a-c56983db0020\") " pod="openshift-ingress-canary/ingress-canary-gnrhq" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mchtc\" (UniqueName: \"kubernetes.io/projected/6bc0f806-810b-48df-8795-fb4962e906c1-kube-api-access-mchtc\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 
05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-metrics-certs\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzwpt\" (UniqueName: \"kubernetes.io/projected/048d19f3-df93-4565-87ea-dd7ce3d9e888-kube-api-access-zzwpt\") pod \"service-ca-operator-777779d784-vx8tz\" (UID: \"048d19f3-df93-4565-87ea-dd7ce3d9e888\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-tls\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.530672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19a7180b-9bd3-4a29-8c77-e385307350cc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-68b78\" (UID: \"19a7180b-9bd3-4a29-8c77-e385307350cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.533018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-service-ca\") pod \"console-f9d7485db-m2l7m\" (UID: 
\"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.533484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/de733605-5332-483f-b086-f97251d14bab-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.533914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-trusted-ca-bundle\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.534165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/268d3c04-eb32-411a-8032-caa99ca62ade-config-volume\") pod \"dns-default-7f42d\" (UID: \"268d3c04-eb32-411a-8032-caa99ca62ade\") " pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.534844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-oauth-config\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.535017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19a7180b-9bd3-4a29-8c77-e385307350cc-config\") pod \"kube-controller-manager-operator-78b949d7b-68b78\" (UID: 
\"19a7180b-9bd3-4a29-8c77-e385307350cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.535611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-serving-cert\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.537721 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp"] Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.538309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-stats-auth\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.539764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.540020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19a7180b-9bd3-4a29-8c77-e385307350cc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-68b78\" (UID: \"19a7180b-9bd3-4a29-8c77-e385307350cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 
18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.540235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/268d3c04-eb32-411a-8032-caa99ca62ade-metrics-tls\") pod \"dns-default-7f42d\" (UID: \"268d3c04-eb32-411a-8032-caa99ca62ade\") " pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.540949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-tls\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.543154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76d59c18-7131-4e63-bb20-3ae0e2ac8edb-serving-cert\") pod \"openshift-config-operator-7777fb866f-wd4tk\" (UID: \"76d59c18-7131-4e63-bb20-3ae0e2ac8edb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.544496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4c25682-b294-4940-be7e-2f5eb7e7366d-metrics-tls\") pod \"dns-operator-744455d44c-8258q\" (UID: \"c4c25682-b294-4940-be7e-2f5eb7e7366d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8258q" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.554615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-metrics-certs\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 
05:48:24.554767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-bound-sa-token\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.556393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-serving-cert\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.564703 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl"] Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.569845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de733605-5332-483f-b086-f97251d14bab-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.619358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mln25\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-kube-api-access-mln25\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.622888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvcs5\" (UniqueName: 
\"kubernetes.io/projected/c4c25682-b294-4940-be7e-2f5eb7e7366d-kube-api-access-lvcs5\") pod \"dns-operator-744455d44c-8258q\" (UID: \"c4c25682-b294-4940-be7e-2f5eb7e7366d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8258q" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.624649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l258m\" (UniqueName: \"kubernetes.io/projected/76d59c18-7131-4e63-bb20-3ae0e2ac8edb-kube-api-access-l258m\") pod \"openshift-config-operator-7777fb866f-wd4tk\" (UID: \"76d59c18-7131-4e63-bb20-3ae0e2ac8edb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.632503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.632817 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/00e150c8-f0e4-4038-8213-cc395d64d48b-etcd-client\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.632847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/00e150c8-f0e4-4038-8213-cc395d64d48b-etcd-ca\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.632866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78651c6-6845-4f2a-a2dc-16590108a302-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tp7zn\" (UID: \"e78651c6-6845-4f2a-a2dc-16590108a302\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.632896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e09b1e8f-752e-42dc-a638-cc7ac7179f83-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tnwns\" (UID: \"e09b1e8f-752e-42dc-a638-cc7ac7179f83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.632934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz924\" (UniqueName: \"kubernetes.io/projected/34a7237c-6b9e-4036-8649-ab89d0ec1893-kube-api-access-qz924\") pod \"catalog-operator-68c6474976-4sp72\" (UID: \"34a7237c-6b9e-4036-8649-ab89d0ec1893\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.632954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdb9f9f8-7162-479f-a789-dd3e61578ec4-secret-volume\") pod \"collect-profiles-29523225-rdxbl\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.632971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgc4n\" (UniqueName: \"kubernetes.io/projected/15219006-aa83-4fc4-afed-79a5e7f18eda-kube-api-access-fgc4n\") pod \"kube-storage-version-migrator-operator-b67b599dd-fp5dm\" (UID: 
\"15219006-aa83-4fc4-afed-79a5e7f18eda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.632988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ccb59e4a-55d3-436a-94c1-e926834470fb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hgb9s\" (UID: \"ccb59e4a-55d3-436a-94c1-e926834470fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-images\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6bc0f806-810b-48df-8795-fb4962e906c1-ready\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfdj8\" (UniqueName: \"kubernetes.io/projected/52fa1b0d-b1f0-4964-ba77-417233b07f60-kube-api-access-nfdj8\") pod \"package-server-manager-789f6589d5-pc79b\" (UID: \"52fa1b0d-b1f0-4964-ba77-417233b07f60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2-srv-cert\") pod \"olm-operator-6b444d44fb-pvdd9\" (UID: \"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddb09044-1d2b-435d-a424-d22fbac84ca8-proxy-tls\") pod \"machine-config-controller-84d6567774-lhw9r\" (UID: \"ddb09044-1d2b-435d-a424-d22fbac84ca8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/34a7237c-6b9e-4036-8649-ab89d0ec1893-srv-cert\") pod \"catalog-operator-68c6474976-4sp72\" (UID: \"34a7237c-6b9e-4036-8649-ab89d0ec1893\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4bm6\" (UniqueName: \"kubernetes.io/projected/fdb9f9f8-7162-479f-a789-dd3e61578ec4-kube-api-access-h4bm6\") pod \"collect-profiles-29523225-rdxbl\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pvdd9\" (UID: \"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 
05:48:24.633169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/048d19f3-df93-4565-87ea-dd7ce3d9e888-serving-cert\") pod \"service-ca-operator-777779d784-vx8tz\" (UID: \"048d19f3-df93-4565-87ea-dd7ce3d9e888\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/52fa1b0d-b1f0-4964-ba77-417233b07f60-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-pc79b\" (UID: \"52fa1b0d-b1f0-4964-ba77-417233b07f60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/496fadfa-36c2-47ff-a5b1-a485de9df869-webhook-cert\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsdgw\" (UniqueName: \"kubernetes.io/projected/496fadfa-36c2-47ff-a5b1-a485de9df869-kube-api-access-qsdgw\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-g8pt7\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kg5v\" (UniqueName: \"kubernetes.io/projected/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-kube-api-access-9kg5v\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78651c6-6845-4f2a-a2dc-16590108a302-config\") pod \"kube-apiserver-operator-766d6c64bb-tp7zn\" (UID: \"e78651c6-6845-4f2a-a2dc-16590108a302\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-socket-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-mountpoint-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlgjd\" (UniqueName: 
\"kubernetes.io/projected/bb2eb67f-2587-4991-93cd-aeb2eed647e3-kube-api-access-dlgjd\") pod \"service-ca-9c57cc56f-6jdbh\" (UID: \"bb2eb67f-2587-4991-93cd-aeb2eed647e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/496fadfa-36c2-47ff-a5b1-a485de9df869-tmpfs\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbh2\" (UniqueName: \"kubernetes.io/projected/5566031f-2b93-4c7f-baff-f5929fc609ed-kube-api-access-fsbh2\") pod \"machine-config-server-7g9qp\" (UID: \"5566031f-2b93-4c7f-baff-f5929fc609ed\") " pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-registration-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-plugins-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qnqvs\" (UniqueName: \"kubernetes.io/projected/8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2-kube-api-access-qnqvs\") pod \"olm-operator-6b444d44fb-pvdd9\" (UID: \"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f33e8e-2190-4402-9a3d-b8f7f3324da4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7vfl\" (UID: \"48f33e8e-2190-4402-9a3d-b8f7f3324da4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/496fadfa-36c2-47ff-a5b1-a485de9df869-apiservice-cert\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5566031f-2b93-4c7f-baff-f5929fc609ed-node-bootstrap-token\") pod \"machine-config-server-7g9qp\" (UID: \"5566031f-2b93-4c7f-baff-f5929fc609ed\") " pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psdmw\" (UniqueName: \"kubernetes.io/projected/abbe64a9-dc0c-4b1b-931e-dddd926a91c8-kube-api-access-psdmw\") pod \"migrator-59844c95c7-5sgn9\" (UID: \"abbe64a9-dc0c-4b1b-931e-dddd926a91c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9" Feb 18 
05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48f33e8e-2190-4402-9a3d-b8f7f3324da4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7vfl\" (UID: \"48f33e8e-2190-4402-9a3d-b8f7f3324da4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-g8pt7\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1888d8d-c40a-4ee1-a697-6f5e97bf657d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l2jp2\" (UID: \"c1888d8d-c40a-4ee1-a697-6f5e97bf657d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633596 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx85l\" (UniqueName: \"kubernetes.io/projected/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-kube-api-access-mx85l\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633616 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/ddb09044-1d2b-435d-a424-d22fbac84ca8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lhw9r\" (UID: \"ddb09044-1d2b-435d-a424-d22fbac84ca8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd648\" (UniqueName: \"kubernetes.io/projected/c1888d8d-c40a-4ee1-a697-6f5e97bf657d-kube-api-access-kd648\") pod \"openshift-controller-manager-operator-756b6f6bc6-l2jp2\" (UID: \"c1888d8d-c40a-4ee1-a697-6f5e97bf657d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6bc0f806-810b-48df-8795-fb4962e906c1-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-csi-data-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/34a7237c-6b9e-4036-8649-ab89d0ec1893-profile-collector-cert\") pod \"catalog-operator-68c6474976-4sp72\" (UID: \"34a7237c-6b9e-4036-8649-ab89d0ec1893\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 
05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15219006-aa83-4fc4-afed-79a5e7f18eda-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fp5dm\" (UID: \"15219006-aa83-4fc4-afed-79a5e7f18eda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7bb7\" (UniqueName: \"kubernetes.io/projected/ddb09044-1d2b-435d-a424-d22fbac84ca8-kube-api-access-x7bb7\") pod \"machine-config-controller-84d6567774-lhw9r\" (UID: \"ddb09044-1d2b-435d-a424-d22fbac84ca8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633788 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q589n\" (UniqueName: \"kubernetes.io/projected/ccb59e4a-55d3-436a-94c1-e926834470fb-kube-api-access-q589n\") pod \"multus-admission-controller-857f4d67dd-hgb9s\" (UID: \"ccb59e4a-55d3-436a-94c1-e926834470fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633818 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e78651c6-6845-4f2a-a2dc-16590108a302-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tp7zn\" (UID: \"e78651c6-6845-4f2a-a2dc-16590108a302\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00e150c8-f0e4-4038-8213-cc395d64d48b-serving-cert\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztmsn\" (UniqueName: \"kubernetes.io/projected/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-kube-api-access-ztmsn\") pod \"marketplace-operator-79b997595-g8pt7\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.633985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mchtc\" (UniqueName: \"kubernetes.io/projected/6bc0f806-810b-48df-8795-fb4962e906c1-kube-api-access-mchtc\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/00e150c8-f0e4-4038-8213-cc395d64d48b-etcd-service-ca\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634018 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6921a979-b8b5-4664-af9a-c56983db0020-cert\") pod \"ingress-canary-gnrhq\" (UID: \"6921a979-b8b5-4664-af9a-c56983db0020\") " pod="openshift-ingress-canary/ingress-canary-gnrhq" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzwpt\" (UniqueName: \"kubernetes.io/projected/048d19f3-df93-4565-87ea-dd7ce3d9e888-kube-api-access-zzwpt\") pod \"service-ca-operator-777779d784-vx8tz\" (UID: \"048d19f3-df93-4565-87ea-dd7ce3d9e888\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48f33e8e-2190-4402-9a3d-b8f7f3324da4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7vfl\" (UID: \"48f33e8e-2190-4402-9a3d-b8f7f3324da4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bb2eb67f-2587-4991-93cd-aeb2eed647e3-signing-cabundle\") pod \"service-ca-9c57cc56f-6jdbh\" (UID: \"bb2eb67f-2587-4991-93cd-aeb2eed647e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634104 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bb2eb67f-2587-4991-93cd-aeb2eed647e3-signing-key\") pod \"service-ca-9c57cc56f-6jdbh\" (UID: \"bb2eb67f-2587-4991-93cd-aeb2eed647e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:24 
crc kubenswrapper[4707]: I0218 05:48:24.634120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bc0f806-810b-48df-8795-fb4962e906c1-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn6td\" (UniqueName: \"kubernetes.io/projected/e09b1e8f-752e-42dc-a638-cc7ac7179f83-kube-api-access-qn6td\") pod \"control-plane-machine-set-operator-78cbb6b69f-tnwns\" (UID: \"e09b1e8f-752e-42dc-a638-cc7ac7179f83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15219006-aa83-4fc4-afed-79a5e7f18eda-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fp5dm\" (UID: \"15219006-aa83-4fc4-afed-79a5e7f18eda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048d19f3-df93-4565-87ea-dd7ce3d9e888-config\") pod \"service-ca-operator-777779d784-vx8tz\" (UID: \"048d19f3-df93-4565-87ea-dd7ce3d9e888\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:24 crc kubenswrapper[4707]: E0218 05:48:24.634257 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-18 05:48:25.134240249 +0000 UTC m=+41.782199383 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.634532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78651c6-6845-4f2a-a2dc-16590108a302-config\") pod \"kube-apiserver-operator-766d6c64bb-tp7zn\" (UID: \"e78651c6-6845-4f2a-a2dc-16590108a302\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.636170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/00e150c8-f0e4-4038-8213-cc395d64d48b-etcd-ca\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.637549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-socket-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.637591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-mountpoint-dir\") pod 
\"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.637962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/496fadfa-36c2-47ff-a5b1-a485de9df869-tmpfs\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.638058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-registration-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.638086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-plugins-dir\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.638635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f33e8e-2190-4402-9a3d-b8f7f3324da4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7vfl\" (UID: \"48f33e8e-2190-4402-9a3d-b8f7f3324da4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.639079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-csi-data-dir\") pod \"csi-hostpathplugin-kjq22\" 
(UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.639568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/00e150c8-f0e4-4038-8213-cc395d64d48b-etcd-client\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.639708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/52fa1b0d-b1f0-4964-ba77-417233b07f60-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-pc79b\" (UID: \"52fa1b0d-b1f0-4964-ba77-417233b07f60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.640612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6bc0f806-810b-48df-8795-fb4962e906c1-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.640860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048d19f3-df93-4565-87ea-dd7ce3d9e888-config\") pod \"service-ca-operator-777779d784-vx8tz\" (UID: \"048d19f3-df93-4565-87ea-dd7ce3d9e888\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6bc0f806-810b-48df-8795-fb4962e906c1-ready\") pod 
\"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641177 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5566031f-2b93-4c7f-baff-f5929fc609ed-certs\") pod \"machine-config-server-7g9qp\" (UID: \"5566031f-2b93-4c7f-baff-f5929fc609ed\") " pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1888d8d-c40a-4ee1-a697-6f5e97bf657d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l2jp2\" (UID: \"c1888d8d-c40a-4ee1-a697-6f5e97bf657d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbqkx\" (UniqueName: \"kubernetes.io/projected/6921a979-b8b5-4664-af9a-c56983db0020-kube-api-access-lbqkx\") pod \"ingress-canary-gnrhq\" (UID: \"6921a979-b8b5-4664-af9a-c56983db0020\") " pod="openshift-ingress-canary/ingress-canary-gnrhq" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00e150c8-f0e4-4038-8213-cc395d64d48b-config\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-proxy-tls\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg4jk\" (UniqueName: \"kubernetes.io/projected/00e150c8-f0e4-4038-8213-cc395d64d48b-kube-api-access-xg4jk\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdb9f9f8-7162-479f-a789-dd3e61578ec4-config-volume\") pod \"collect-profiles-29523225-rdxbl\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/00e150c8-f0e4-4038-8213-cc395d64d48b-etcd-service-ca\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.641835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/496fadfa-36c2-47ff-a5b1-a485de9df869-apiservice-cert\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.642574 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.642637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdb9f9f8-7162-479f-a789-dd3e61578ec4-config-volume\") pod \"collect-profiles-29523225-rdxbl\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.645743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15219006-aa83-4fc4-afed-79a5e7f18eda-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fp5dm\" (UID: \"15219006-aa83-4fc4-afed-79a5e7f18eda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.645837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bc0f806-810b-48df-8795-fb4962e906c1-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.648983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00e150c8-f0e4-4038-8213-cc395d64d48b-config\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc 
kubenswrapper[4707]: I0218 05:48:24.649163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bb2eb67f-2587-4991-93cd-aeb2eed647e3-signing-cabundle\") pod \"service-ca-9c57cc56f-6jdbh\" (UID: \"bb2eb67f-2587-4991-93cd-aeb2eed647e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.650014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5566031f-2b93-4c7f-baff-f5929fc609ed-node-bootstrap-token\") pod \"machine-config-server-7g9qp\" (UID: \"5566031f-2b93-4c7f-baff-f5929fc609ed\") " pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.650656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/34a7237c-6b9e-4036-8649-ab89d0ec1893-profile-collector-cert\") pod \"catalog-operator-68c6474976-4sp72\" (UID: \"34a7237c-6b9e-4036-8649-ab89d0ec1893\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.654399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1888d8d-c40a-4ee1-a697-6f5e97bf657d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l2jp2\" (UID: \"c1888d8d-c40a-4ee1-a697-6f5e97bf657d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.657595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ddb09044-1d2b-435d-a424-d22fbac84ca8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lhw9r\" (UID: 
\"ddb09044-1d2b-435d-a424-d22fbac84ca8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.657817 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1888d8d-c40a-4ee1-a697-6f5e97bf657d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l2jp2\" (UID: \"c1888d8d-c40a-4ee1-a697-6f5e97bf657d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.658330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/34a7237c-6b9e-4036-8649-ab89d0ec1893-srv-cert\") pod \"catalog-operator-68c6474976-4sp72\" (UID: \"34a7237c-6b9e-4036-8649-ab89d0ec1893\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.658701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48f33e8e-2190-4402-9a3d-b8f7f3324da4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7vfl\" (UID: \"48f33e8e-2190-4402-9a3d-b8f7f3324da4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.659493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-images\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.660628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5566031f-2b93-4c7f-baff-f5929fc609ed-certs\") pod \"machine-config-server-7g9qp\" (UID: \"5566031f-2b93-4c7f-baff-f5929fc609ed\") " pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.661560 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00e150c8-f0e4-4038-8213-cc395d64d48b-serving-cert\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.663179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddb09044-1d2b-435d-a424-d22fbac84ca8-proxy-tls\") pod \"machine-config-controller-84d6567774-lhw9r\" (UID: \"ddb09044-1d2b-435d-a424-d22fbac84ca8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.663345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-g8pt7\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.663557 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/048d19f3-df93-4565-87ea-dd7ce3d9e888-serving-cert\") pod \"service-ca-operator-777779d784-vx8tz\" (UID: \"048d19f3-df93-4565-87ea-dd7ce3d9e888\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.664650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2-srv-cert\") pod \"olm-operator-6b444d44fb-pvdd9\" (UID: \"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.670343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pvdd9\" (UID: \"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.671278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bb2eb67f-2587-4991-93cd-aeb2eed647e3-signing-key\") pod \"service-ca-9c57cc56f-6jdbh\" (UID: \"bb2eb67f-2587-4991-93cd-aeb2eed647e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.671616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsk6w\" (UniqueName: \"kubernetes.io/projected/e1a6c244-2da0-4dc4-8086-b3a725e8b24b-kube-api-access-rsk6w\") pod \"router-default-5444994796-b87f8\" (UID: \"e1a6c244-2da0-4dc4-8086-b3a725e8b24b\") " pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.671874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15219006-aa83-4fc4-afed-79a5e7f18eda-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fp5dm\" (UID: \"15219006-aa83-4fc4-afed-79a5e7f18eda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 
05:48:24.672231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/496fadfa-36c2-47ff-a5b1-a485de9df869-webhook-cert\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.672667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-proxy-tls\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.673862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cbn6t"] Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.679650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdb9f9f8-7162-479f-a789-dd3e61578ec4-secret-volume\") pod \"collect-profiles-29523225-rdxbl\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.680191 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-74wxc"] Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.681465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78651c6-6845-4f2a-a2dc-16590108a302-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tp7zn\" (UID: \"e78651c6-6845-4f2a-a2dc-16590108a302\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:24 crc 
kubenswrapper[4707]: I0218 05:48:24.693236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6921a979-b8b5-4664-af9a-c56983db0020-cert\") pod \"ingress-canary-gnrhq\" (UID: \"6921a979-b8b5-4664-af9a-c56983db0020\") " pod="openshift-ingress-canary/ingress-canary-gnrhq" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.694008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-g8pt7\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.696012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.698591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbfcc\" (UniqueName: \"kubernetes.io/projected/a8657192-49a2-4c45-bc94-bbc3e2e608af-kube-api-access-gbfcc\") pod \"console-f9d7485db-m2l7m\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.698611 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k2dhh"] Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.715824 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/e09b1e8f-752e-42dc-a638-cc7ac7179f83-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tnwns\" (UID: \"e09b1e8f-752e-42dc-a638-cc7ac7179f83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.716023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ccb59e4a-55d3-436a-94c1-e926834470fb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hgb9s\" (UID: \"ccb59e4a-55d3-436a-94c1-e926834470fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" Feb 18 05:48:24 crc kubenswrapper[4707]: W0218 05:48:24.718376 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec70ffb1_f091_47ed_b947_6af13fd6d34f.slice/crio-1d5ac31ec1eb769cca3f84edeef693be6d718b79935d903df69a6bfda6e0c3ff WatchSource:0}: Error finding container 1d5ac31ec1eb769cca3f84edeef693be6d718b79935d903df69a6bfda6e0c3ff: Status 404 returned error can't find the container with id 1d5ac31ec1eb769cca3f84edeef693be6d718b79935d903df69a6bfda6e0c3ff Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.719815 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glpjm\" (UniqueName: \"kubernetes.io/projected/0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e-kube-api-access-glpjm\") pod \"ingress-operator-5b745b69d9-qf9jw\" (UID: \"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.725365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnh2m\" (UniqueName: \"kubernetes.io/projected/268d3c04-eb32-411a-8032-caa99ca62ade-kube-api-access-hnh2m\") pod \"dns-default-7f42d\" (UID: 
\"268d3c04-eb32-411a-8032-caa99ca62ade\") " pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.741920 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9lcx\" (UniqueName: \"kubernetes.io/projected/4e1b6c8f-528f-41ca-bcc7-459cd6da12b6-kube-api-access-l9lcx\") pod \"console-operator-58897d9998-sv9z4\" (UID: \"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6\") " pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.743057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: E0218 05:48:24.743478 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:25.243464955 +0000 UTC m=+41.891424089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.765765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c25xt\" (UniqueName: \"kubernetes.io/projected/055d0ae1-3458-4a39-85ae-6880ec2bae14-kube-api-access-c25xt\") pod \"downloads-7954f5f757-f9dnq\" (UID: \"055d0ae1-3458-4a39-85ae-6880ec2bae14\") " pod="openshift-console/downloads-7954f5f757-f9dnq" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.769023 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t"] Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.783181 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-785km\" (UniqueName: \"kubernetes.io/projected/de733605-5332-483f-b086-f97251d14bab-kube-api-access-785km\") pod \"cluster-image-registry-operator-dc59b4c8b-7b2sl\" (UID: \"de733605-5332-483f-b086-f97251d14bab\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: W0218 05:48:24.797880 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod384cd184_04e6_4505_b0f3_a2e367bb6dcd.slice/crio-f4522d7ef0694942df1fcf1612fdfdac18d1e095f54aaf89798fefbf18138455 WatchSource:0}: Error finding container f4522d7ef0694942df1fcf1612fdfdac18d1e095f54aaf89798fefbf18138455: Status 404 returned error can't find the container with id 
f4522d7ef0694942df1fcf1612fdfdac18d1e095f54aaf89798fefbf18138455 Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.804815 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19a7180b-9bd3-4a29-8c77-e385307350cc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-68b78\" (UID: \"19a7180b-9bd3-4a29-8c77-e385307350cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.809261 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-75w4w"] Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.827377 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.831598 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.838397 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f9dnq" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.844760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:24 crc kubenswrapper[4707]: E0218 05:48:24.845353 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:25.345322568 +0000 UTC m=+41.993281702 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.846705 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfdj8\" (UniqueName: \"kubernetes.io/projected/52fa1b0d-b1f0-4964-ba77-417233b07f60-kube-api-access-nfdj8\") pod \"package-server-manager-789f6589d5-pc79b\" (UID: \"52fa1b0d-b1f0-4964-ba77-417233b07f60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.854674 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.856843 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.863559 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.869546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztmsn\" (UniqueName: \"kubernetes.io/projected/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-kube-api-access-ztmsn\") pod \"marketplace-operator-79b997595-g8pt7\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.874138 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8258q" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.886463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlgjd\" (UniqueName: \"kubernetes.io/projected/bb2eb67f-2587-4991-93cd-aeb2eed647e3-kube-api-access-dlgjd\") pod \"service-ca-9c57cc56f-6jdbh\" (UID: \"bb2eb67f-2587-4991-93cd-aeb2eed647e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.900840 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn6td\" (UniqueName: \"kubernetes.io/projected/e09b1e8f-752e-42dc-a638-cc7ac7179f83-kube-api-access-qn6td\") pod \"control-plane-machine-set-operator-78cbb6b69f-tnwns\" (UID: \"e09b1e8f-752e-42dc-a638-cc7ac7179f83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.908966 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.916756 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.920092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbh2\" (UniqueName: \"kubernetes.io/projected/5566031f-2b93-4c7f-baff-f5929fc609ed-kube-api-access-fsbh2\") pod \"machine-config-server-7g9qp\" (UID: \"5566031f-2b93-4c7f-baff-f5929fc609ed\") " pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:24 crc kubenswrapper[4707]: W0218 05:48:24.924295 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91397883_fd02_4070_aa06_18da845bbeeb.slice/crio-416594fa9367d972bb39786192da1f7e644ed017da0fcc6b0557c93c292b5ac2 WatchSource:0}: Error finding container 416594fa9367d972bb39786192da1f7e644ed017da0fcc6b0557c93c292b5ac2: Status 404 returned error can't find the container with id 416594fa9367d972bb39786192da1f7e644ed017da0fcc6b0557c93c292b5ac2 Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.929413 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.943734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnqvs\" (UniqueName: \"kubernetes.io/projected/8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2-kube-api-access-qnqvs\") pod \"olm-operator-6b444d44fb-pvdd9\" (UID: \"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.946416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.946509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:24 crc kubenswrapper[4707]: E0218 05:48:24.947266 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:25.447210851 +0000 UTC m=+42.095169985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.952071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95bdd5db-88ec-41b6-9752-b5646a64f1ae-metrics-certs\") pod \"network-metrics-daemon-snscc\" (UID: \"95bdd5db-88ec-41b6-9752-b5646a64f1ae\") " pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.952379 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snscc" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.960960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7bb7\" (UniqueName: \"kubernetes.io/projected/ddb09044-1d2b-435d-a424-d22fbac84ca8-kube-api-access-x7bb7\") pod \"machine-config-controller-84d6567774-lhw9r\" (UID: \"ddb09044-1d2b-435d-a424-d22fbac84ca8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.979385 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q589n\" (UniqueName: \"kubernetes.io/projected/ccb59e4a-55d3-436a-94c1-e926834470fb-kube-api-access-q589n\") pod \"multus-admission-controller-857f4d67dd-hgb9s\" (UID: \"ccb59e4a-55d3-436a-94c1-e926834470fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" Feb 18 05:48:24 crc kubenswrapper[4707]: I0218 05:48:24.989959 4707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.002882 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e78651c6-6845-4f2a-a2dc-16590108a302-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tp7zn\" (UID: \"e78651c6-6845-4f2a-a2dc-16590108a302\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.019656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4bm6\" (UniqueName: \"kubernetes.io/projected/fdb9f9f8-7162-479f-a789-dd3e61578ec4-kube-api-access-h4bm6\") pod \"collect-profiles-29523225-rdxbl\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.029014 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.044125 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.044344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzwpt\" (UniqueName: \"kubernetes.io/projected/048d19f3-df93-4565-87ea-dd7ce3d9e888-kube-api-access-zzwpt\") pod \"service-ca-operator-777779d784-vx8tz\" (UID: \"048d19f3-df93-4565-87ea-dd7ce3d9e888\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.047859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.048399 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:25.548383716 +0000 UTC m=+42.196342850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.053389 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.058032 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.068894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48f33e8e-2190-4402-9a3d-b8f7f3324da4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m7vfl\" (UID: \"48f33e8e-2190-4402-9a3d-b8f7f3324da4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.071669 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.079418 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.080440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kg5v\" (UniqueName: \"kubernetes.io/projected/abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3-kube-api-access-9kg5v\") pod \"machine-config-operator-74547568cd-77bg6\" (UID: \"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.085652 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.093344 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7g9qp" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.100212 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.117809 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbqkx\" (UniqueName: \"kubernetes.io/projected/6921a979-b8b5-4664-af9a-c56983db0020-kube-api-access-lbqkx\") pod \"ingress-canary-gnrhq\" (UID: \"6921a979-b8b5-4664-af9a-c56983db0020\") " pod="openshift-ingress-canary/ingress-canary-gnrhq" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.127365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd648\" (UniqueName: \"kubernetes.io/projected/c1888d8d-c40a-4ee1-a697-6f5e97bf657d-kube-api-access-kd648\") pod \"openshift-controller-manager-operator-756b6f6bc6-l2jp2\" (UID: \"c1888d8d-c40a-4ee1-a697-6f5e97bf657d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.131361 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.143690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg4jk\" (UniqueName: \"kubernetes.io/projected/00e150c8-f0e4-4038-8213-cc395d64d48b-kube-api-access-xg4jk\") pod \"etcd-operator-b45778765-7qhxj\" (UID: \"00e150c8-f0e4-4038-8213-cc395d64d48b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.153045 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gnrhq" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.153359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.153911 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:25.653891485 +0000 UTC m=+42.301850629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.172081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx85l\" (UniqueName: \"kubernetes.io/projected/fad3b2dc-a4ef-4a58-b382-993b82f4fcbc-kube-api-access-mx85l\") pod \"csi-hostpathplugin-kjq22\" (UID: \"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc\") " pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.174748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sv9z4"] Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.184503 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgc4n\" (UniqueName: \"kubernetes.io/projected/15219006-aa83-4fc4-afed-79a5e7f18eda-kube-api-access-fgc4n\") pod \"kube-storage-version-migrator-operator-b67b599dd-fp5dm\" (UID: \"15219006-aa83-4fc4-afed-79a5e7f18eda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.202654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz924\" (UniqueName: \"kubernetes.io/projected/34a7237c-6b9e-4036-8649-ab89d0ec1893-kube-api-access-qz924\") pod \"catalog-operator-68c6474976-4sp72\" (UID: \"34a7237c-6b9e-4036-8649-ab89d0ec1893\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.227746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psdmw\" (UniqueName: \"kubernetes.io/projected/abbe64a9-dc0c-4b1b-931e-dddd926a91c8-kube-api-access-psdmw\") pod \"migrator-59844c95c7-5sgn9\" (UID: \"abbe64a9-dc0c-4b1b-931e-dddd926a91c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.254639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.254861 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:25.754830403 +0000 UTC m=+42.402789537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.257599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mchtc\" (UniqueName: \"kubernetes.io/projected/6bc0f806-810b-48df-8795-fb4962e906c1-kube-api-access-mchtc\") pod \"cni-sysctl-allowlist-ds-879zs\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.259765 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.261579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.261915 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78"] Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.262423 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:25.762385222 +0000 UTC m=+42.410344356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.274242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsdgw\" (UniqueName: \"kubernetes.io/projected/496fadfa-36c2-47ff-a5b1-a485de9df869-kube-api-access-qsdgw\") pod \"packageserver-d55dfcdfc-j954g\" (UID: \"496fadfa-36c2-47ff-a5b1-a485de9df869\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.274607 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" Feb 18 05:48:25 crc kubenswrapper[4707]: W0218 05:48:25.276937 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1a6c244_2da0_4dc4_8086_b3a725e8b24b.slice/crio-1c9c5a9e38116a653166c4a627e0c531505526e5fa771e9d8c854a4f94059ad2 WatchSource:0}: Error finding container 1c9c5a9e38116a653166c4a627e0c531505526e5fa771e9d8c854a4f94059ad2: Status 404 returned error can't find the container with id 1c9c5a9e38116a653166c4a627e0c531505526e5fa771e9d8c854a4f94059ad2 Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.290883 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.297709 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.328941 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.329034 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.330192 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.337063 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.363458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.363876 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:25.863860855 +0000 UTC m=+42.511819989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.365207 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.419253 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kjq22" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.435095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" event={"ID":"24beed91-e86e-4dae-a372-ea06be0cefb9","Type":"ContainerStarted","Data":"eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.435179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" event={"ID":"24beed91-e86e-4dae-a372-ea06be0cefb9","Type":"ContainerStarted","Data":"2ba40bd5c2d03883067411fa0a2a7b1eec4171f5681b028346eb84f13d223352"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.435754 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.457009 4707 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-74wxc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.457081 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" podUID="24beed91-e86e-4dae-a372-ea06be0cefb9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.462916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" event={"ID":"feac3e8b-68a4-4bf4-aabc-e76ef9670361","Type":"ContainerStarted","Data":"9cfa2ea54f38f3d228ffc09a113c93cd7188ea361756429b89f4bbce1ef9affd"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.462996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" event={"ID":"feac3e8b-68a4-4bf4-aabc-e76ef9670361","Type":"ContainerStarted","Data":"e9e08cf2143c1bd24e9b569c6c0d0f121eb62a5108c321ae074d9b0df8da4fa9"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.470173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.470733 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:25.970717158 +0000 UTC m=+42.618676292 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.480270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" event={"ID":"81046e41-7512-4b56-b0d7-ccfa783f973a","Type":"ContainerStarted","Data":"ddb28bd6cb27d402a4ee997c3e6250a555dcf2157a85b5ad5d00c3600c3a11ca"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.480332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" event={"ID":"81046e41-7512-4b56-b0d7-ccfa783f973a","Type":"ContainerStarted","Data":"7c6c8927cd0119ecc210354d7c9891981827b6cf9cd2c197149d2a3b9aa615bb"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.486091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-b87f8" event={"ID":"e1a6c244-2da0-4dc4-8086-b3a725e8b24b","Type":"ContainerStarted","Data":"1c9c5a9e38116a653166c4a627e0c531505526e5fa771e9d8c854a4f94059ad2"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.496830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" event={"ID":"6eede7e5-9282-4ee7-b3c0-7b81ccc81503","Type":"ContainerStarted","Data":"7f5b10c9287a7ebf94a167f088c130133d7cbc2a849befd2d6826c3d9bf260ca"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.496871 
4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" event={"ID":"6eede7e5-9282-4ee7-b3c0-7b81ccc81503","Type":"ContainerStarted","Data":"86371190b75a6f9d84930464b4152a25248a7fcb86ff11a2fc88663795caade1"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.510984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" event={"ID":"b7a4eced-46b2-4002-964d-490b0ad2acd3","Type":"ContainerStarted","Data":"0c008191a954fb743bb114a025529771dbc679bfb5f4b6b544ae1396933d256b"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.511042 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" event={"ID":"b7a4eced-46b2-4002-964d-490b0ad2acd3","Type":"ContainerStarted","Data":"1fe7c7e6050ef51bb5ef3ba3e013bd56d7c95d56d1cbe63d7af507f4e3cba43d"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.511073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" event={"ID":"b7a4eced-46b2-4002-964d-490b0ad2acd3","Type":"ContainerStarted","Data":"b5749bb185c557026c303d1fd4d5124d8a734887a0c450e3e3dabe5dcaf4c2b5"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.512548 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" event={"ID":"91397883-fd02-4070-aa06-18da845bbeeb","Type":"ContainerStarted","Data":"416594fa9367d972bb39786192da1f7e644ed017da0fcc6b0557c93c292b5ac2"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.517654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" event={"ID":"384cd184-04e6-4505-b0f3-a2e367bb6dcd","Type":"ContainerStarted","Data":"f4522d7ef0694942df1fcf1612fdfdac18d1e095f54aaf89798fefbf18138455"} Feb 18 05:48:25 crc 
kubenswrapper[4707]: I0218 05:48:25.529103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" event={"ID":"ec70ffb1-f091-47ed-b947-6af13fd6d34f","Type":"ContainerStarted","Data":"ee367f57fe225be18c2d2923000ffd6d9b1f3315dc09041d9e525b02ec090edf"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.529178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" event={"ID":"ec70ffb1-f091-47ed-b947-6af13fd6d34f","Type":"ContainerStarted","Data":"1d5ac31ec1eb769cca3f84edeef693be6d718b79935d903df69a6bfda6e0c3ff"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.529676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.533376 4707 generic.go:334] "Generic (PLEG): container finished" podID="e80f686a-6a1b-433b-b994-a10eec4758ed" containerID="5b8ac8be8b81c56caf84414db14b2ce5c3a674aa48d6dfa1336155585864aff1" exitCode=0 Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.534411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" event={"ID":"e80f686a-6a1b-433b-b994-a10eec4758ed","Type":"ContainerDied","Data":"5b8ac8be8b81c56caf84414db14b2ce5c3a674aa48d6dfa1336155585864aff1"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.534456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" event={"ID":"e80f686a-6a1b-433b-b994-a10eec4758ed","Type":"ContainerStarted","Data":"5ebc09174ccb2778bf6626ca881adb3bf080d894b601003e527fb3e55fd77023"} Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.540802 4707 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-cbn6t container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get 
\"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.541187 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" podUID="ec70ffb1-f091-47ed-b947-6af13fd6d34f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.571701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.572823 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.072762846 +0000 UTC m=+42.720721980 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.632158 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw"] Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.676870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.681767 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.181748426 +0000 UTC m=+42.829707560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.718585 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7f42d"] Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.761978 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" podStartSLOduration=16.761953788 podStartE2EDuration="16.761953788s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:25.760726706 +0000 UTC m=+42.408685840" watchObservedRunningTime="2026-02-18 05:48:25.761953788 +0000 UTC m=+42.409912922" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.778912 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.779214 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.279186682 +0000 UTC m=+42.927145816 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.779278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.779711 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.279704626 +0000 UTC m=+42.927663750 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: W0218 05:48:25.804933 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0efe17cc_5e27_4a2f_80b2_1a4c7df15f7e.slice/crio-9fc979e4412fd3cf3c792a2d0af1ca2c3084d53acd3e2015c20e07731d37f909 WatchSource:0}: Error finding container 9fc979e4412fd3cf3c792a2d0af1ca2c3084d53acd3e2015c20e07731d37f909: Status 404 returned error can't find the container with id 9fc979e4412fd3cf3c792a2d0af1ca2c3084d53acd3e2015c20e07731d37f909 Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.809703 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk"] Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.814778 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8258q"] Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.831551 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-m2l7m"] Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.845895 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f9dnq"] Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.860380 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6jdbh"] Feb 18 05:48:25 crc kubenswrapper[4707]: W0218 05:48:25.867388 4707 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bc0f806_810b_48df_8795_fb4962e906c1.slice/crio-8b02539ea7924ded5ecbb0291626a60e8c3fa869bbd6442d7d0a1e1c8a32830f WatchSource:0}: Error finding container 8b02539ea7924ded5ecbb0291626a60e8c3fa869bbd6442d7d0a1e1c8a32830f: Status 404 returned error can't find the container with id 8b02539ea7924ded5ecbb0291626a60e8c3fa869bbd6442d7d0a1e1c8a32830f Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.880420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.880913 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.380887141 +0000 UTC m=+43.028846275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.882756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.883256 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.383231203 +0000 UTC m=+43.031190337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: W0218 05:48:25.890745 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod268d3c04_eb32_411a_8032_caa99ca62ade.slice/crio-a2b52b3e6f5ad2944009eac9d6be9ef02b03f1b135568bcfb42e3b95107b1def WatchSource:0}: Error finding container a2b52b3e6f5ad2944009eac9d6be9ef02b03f1b135568bcfb42e3b95107b1def: Status 404 returned error can't find the container with id a2b52b3e6f5ad2944009eac9d6be9ef02b03f1b135568bcfb42e3b95107b1def Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.957427 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" podStartSLOduration=16.957412385 podStartE2EDuration="16.957412385s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:25.955387742 +0000 UTC m=+42.603346866" watchObservedRunningTime="2026-02-18 05:48:25.957412385 +0000 UTC m=+42.605371519" Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.959988 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl"] Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.976546 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-snscc"] Feb 18 05:48:25 crc 
kubenswrapper[4707]: I0218 05:48:25.983688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.983973 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.483945604 +0000 UTC m=+43.131904738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:25 crc kubenswrapper[4707]: I0218 05:48:25.984195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:25 crc kubenswrapper[4707]: E0218 05:48:25.984720 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:26.484710815 +0000 UTC m=+43.132669949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.086959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.087666 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.587642506 +0000 UTC m=+43.235601640 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.089850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.090219 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.590203273 +0000 UTC m=+43.238162407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:26 crc kubenswrapper[4707]: W0218 05:48:26.109560 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d59c18_7131_4e63_bb20_3ae0e2ac8edb.slice/crio-0a54e7a7769f4dff76a7e13c0376e194100f0fb47c2aa2b09a74876491eab1ac WatchSource:0}: Error finding container 0a54e7a7769f4dff76a7e13c0376e194100f0fb47c2aa2b09a74876491eab1ac: Status 404 returned error can't find the container with id 0a54e7a7769f4dff76a7e13c0376e194100f0fb47c2aa2b09a74876491eab1ac Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.191613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.192022 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.691988213 +0000 UTC m=+43.339947347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.192180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.192550 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.692538407 +0000 UTC m=+43.340497541 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.295258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.295763 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.795742616 +0000 UTC m=+43.443701750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.381134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g8pt7"] Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.398279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.399086 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:26.899069707 +0000 UTC m=+43.547028841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.500000 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.500778 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.000761436 +0000 UTC m=+43.648720570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.572052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-b87f8" event={"ID":"e1a6c244-2da0-4dc4-8086-b3a725e8b24b","Type":"ContainerStarted","Data":"3eab5b1a58faff1830b393a2bba0c3091146101ef53b59b6abdbbdf30fae16f6"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.581561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" event={"ID":"de733605-5332-483f-b086-f97251d14bab","Type":"ContainerStarted","Data":"827803fddc0e3d9445d8ce01be2f39fea8003622c86dd0ec0f2b3f69724fed8b"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.601772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sv9z4" event={"ID":"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6","Type":"ContainerStarted","Data":"38bda214b9834b29321012d1973f37bf52d864a0aa5d3a92fd3cbf4cbe8ac6fb"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.601857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sv9z4" event={"ID":"4e1b6c8f-528f-41ca-bcc7-459cd6da12b6","Type":"ContainerStarted","Data":"0f559325c4f0a73bde5ba57f808cf15a566b3220b9725484efd60f9e3e46215a"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.601994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.602641 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.102624967 +0000 UTC m=+43.750584101 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.603388 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sv9z4"
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.633521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" event={"ID":"81046e41-7512-4b56-b0d7-ccfa783f973a","Type":"ContainerStarted","Data":"f9e88ca889a3fa691ae969458f3c0bacc781ce6f6c744514ae057b6966b668c0"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.636253 4707 patch_prober.go:28] interesting pod/console-operator-58897d9998-sv9z4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.636297 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sv9z4" podUID="4e1b6c8f-528f-41ca-bcc7-459cd6da12b6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.646734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f9dnq" event={"ID":"055d0ae1-3458-4a39-85ae-6880ec2bae14","Type":"ContainerStarted","Data":"fa27d5dc963c73d465a7b5c0d87c9de4698ed49a7bb19d648b35c4c2ef99bf99"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.662703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" event={"ID":"76d59c18-7131-4e63-bb20-3ae0e2ac8edb","Type":"ContainerStarted","Data":"0a54e7a7769f4dff76a7e13c0376e194100f0fb47c2aa2b09a74876491eab1ac"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.689280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" event={"ID":"91397883-fd02-4070-aa06-18da845bbeeb","Type":"ContainerStarted","Data":"802c1e8adbe87d8d086d99a6e95b9d0fb5c7fa3ce2d4902eb52829c23c79c96b"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.694502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" event={"ID":"6bc0f806-810b-48df-8795-fb4962e906c1","Type":"ContainerStarted","Data":"8b02539ea7924ded5ecbb0291626a60e8c3fa869bbd6442d7d0a1e1c8a32830f"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.702096 4707 generic.go:334] "Generic (PLEG): container finished" podID="384cd184-04e6-4505-b0f3-a2e367bb6dcd" containerID="af8142aec536784cf1bd57c5617df7f603b3fb9a93c031db8d1042a3793765f2" exitCode=0
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.702230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" event={"ID":"384cd184-04e6-4505-b0f3-a2e367bb6dcd","Type":"ContainerDied","Data":"af8142aec536784cf1bd57c5617df7f603b3fb9a93c031db8d1042a3793765f2"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.706992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.724060 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.224002784 +0000 UTC m=+43.871961918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.744125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" event={"ID":"19a7180b-9bd3-4a29-8c77-e385307350cc","Type":"ContainerStarted","Data":"36022ca9cdcf2b9a8a41d43fa56dd14d02afe61396c5247752bdd30216179c09"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.744713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" event={"ID":"19a7180b-9bd3-4a29-8c77-e385307350cc","Type":"ContainerStarted","Data":"5ea23f3ab77692063ad4656284a8c5ebb808dd316243d65c36fd7045667150fd"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.811630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.816251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-snscc" event={"ID":"95bdd5db-88ec-41b6-9752-b5646a64f1ae","Type":"ContainerStarted","Data":"3edfb2f327ca2be76174cc67dacc57ade53a45948415a58779fe026860028d5c"}
Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.816911 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.31688718 +0000 UTC m=+43.964846514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.854301 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9"]
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.870565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" event={"ID":"e80f686a-6a1b-433b-b994-a10eec4758ed","Type":"ContainerStarted","Data":"11657ae05da3239643531b6e996bcf072268d1dee7bf67eab1c5078b91e07645"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.877625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7g9qp" event={"ID":"5566031f-2b93-4c7f-baff-f5929fc609ed","Type":"ContainerStarted","Data":"8d828bebaa9d0b12b6e44ff46cc7291acfe81ad0b72a2520c2b31006cb7a9ca1"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.877689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7g9qp" event={"ID":"5566031f-2b93-4c7f-baff-f5929fc609ed","Type":"ContainerStarted","Data":"5d89348079c2d532a35bde326574268e3ce90bf8b07074d67cfd5772fa5ab626"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.901963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m2l7m" event={"ID":"a8657192-49a2-4c45-bc94-bbc3e2e608af","Type":"ContainerStarted","Data":"af23d1a942c6640c491cf4f05425f98e4d16a9329a0f9aee15ca80280522bfdb"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.912464 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:26 crc kubenswrapper[4707]: E0218 05:48:26.915487 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.415464187 +0000 UTC m=+44.063423321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.927345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" event={"ID":"bb2eb67f-2587-4991-93cd-aeb2eed647e3","Type":"ContainerStarted","Data":"fe24b3991cb55fb90e8081f1ad860d5b4eca6a3a732e04c85a7250bafd41d496"}
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.946032 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-b87f8"
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.946373 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.946426 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.953989 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns"]
Feb 18 05:48:26 crc kubenswrapper[4707]: I0218 05:48:26.954443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8258q" event={"ID":"c4c25682-b294-4940-be7e-2f5eb7e7366d","Type":"ContainerStarted","Data":"c703cc86989f436255a7ddd19ddfd761fd6b29ee334dd868f87046043bf6e31c"}
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.006788 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" event={"ID":"feac3e8b-68a4-4bf4-aabc-e76ef9670361","Type":"ContainerStarted","Data":"c1bbb912bd121ab3cba5543491832ccf1b504a315ba9d2fdb68d78faf48a4b32"}
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.017577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.018856 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.51884453 +0000 UTC m=+44.166803664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.022652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7f42d" event={"ID":"268d3c04-eb32-411a-8032-caa99ca62ade","Type":"ContainerStarted","Data":"a2b52b3e6f5ad2944009eac9d6be9ef02b03f1b135568bcfb42e3b95107b1def"}
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.028907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" event={"ID":"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e","Type":"ContainerStarted","Data":"78a6542b129ca205d31a5ddb6a51e4c5c685857fa9e1bcd07e3546b09dc9a6a5"}
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.028943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" event={"ID":"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e","Type":"ContainerStarted","Data":"9fc979e4412fd3cf3c792a2d0af1ca2c3084d53acd3e2015c20e07731d37f909"}
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.042654 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.052911 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc"
Feb 18 05:48:27 crc kubenswrapper[4707]: W0218 05:48:27.056754 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode09b1e8f_752e_42dc_a638_cc7ac7179f83.slice/crio-47c30edfe08274b00114598d159b11852f6da504abbf857f112c95a3231af770 WatchSource:0}: Error finding container 47c30edfe08274b00114598d159b11852f6da504abbf857f112c95a3231af770: Status 404 returned error can't find the container with id 47c30edfe08274b00114598d159b11852f6da504abbf857f112c95a3231af770
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.057008 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.063040 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.078919 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qd7rc" podStartSLOduration=19.07888548 podStartE2EDuration="19.07888548s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.051087029 +0000 UTC m=+43.699046163" watchObservedRunningTime="2026-02-18 05:48:27.07888548 +0000 UTC m=+43.726844614"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.091785 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.121670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.123137 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.623122915 +0000 UTC m=+44.271082049 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.177923 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7g9qp" podStartSLOduration=6.177902068 podStartE2EDuration="6.177902068s" podCreationTimestamp="2026-02-18 05:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.121065082 +0000 UTC m=+43.769024216" watchObservedRunningTime="2026-02-18 05:48:27.177902068 +0000 UTC m=+43.825861202"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.189054 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gnrhq"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.209850 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-g4hvr" podStartSLOduration=18.209824769 podStartE2EDuration="18.209824769s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.19354814 +0000 UTC m=+43.841507274" watchObservedRunningTime="2026-02-18 05:48:27.209824769 +0000 UTC m=+43.857783903"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.212607 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hgb9s"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.223593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.224111 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.724100025 +0000 UTC m=+44.372059159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.242308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.284059 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.287050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.329319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.338053 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.837980514 +0000 UTC m=+44.485939648 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.350236 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r"]
Feb 18 05:48:27 crc kubenswrapper[4707]: W0218 05:48:27.377058 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfad3b2dc_a4ef_4a58_b382_993b82f4fcbc.slice/crio-00afea1d71a4a181988be460c787ed76cfa753dd691cf53ab8a9c358fe31dc2c WatchSource:0}: Error finding container 00afea1d71a4a181988be460c787ed76cfa753dd691cf53ab8a9c358fe31dc2c: Status 404 returned error can't find the container with id 00afea1d71a4a181988be460c787ed76cfa753dd691cf53ab8a9c358fe31dc2c
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.377312 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" podStartSLOduration=18.37729442 podStartE2EDuration="18.37729442s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.285698977 +0000 UTC m=+43.933658111" watchObservedRunningTime="2026-02-18 05:48:27.37729442 +0000 UTC m=+44.025253544"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.377988 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.396227 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kjq22"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.401897 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7qhxj"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.401994 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.413302 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.428888 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.428918 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2"]
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.418060 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jh8cp" podStartSLOduration=19.418039773 podStartE2EDuration="19.418039773s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.375496862 +0000 UTC m=+44.023455996" watchObservedRunningTime="2026-02-18 05:48:27.418039773 +0000 UTC m=+44.065998907"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.431435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.431811 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:27.931781445 +0000 UTC m=+44.579740579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.434696 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sv9z4" podStartSLOduration=19.434682631 podStartE2EDuration="19.434682631s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.429943346 +0000 UTC m=+44.077902470" watchObservedRunningTime="2026-02-18 05:48:27.434682631 +0000 UTC m=+44.082641765"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.532824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.533194 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:28.033178925 +0000 UTC m=+44.681138059 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.599584 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-68b78" podStartSLOduration=18.599555392 podStartE2EDuration="18.599555392s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.556736605 +0000 UTC m=+44.204695739" watchObservedRunningTime="2026-02-18 05:48:27.599555392 +0000 UTC m=+44.247514526"
Feb 18 05:48:27 crc kubenswrapper[4707]: W0218 05:48:27.617027 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddb09044_1d2b_435d_a424_d22fbac84ca8.slice/crio-d8d62083196957ce2cfcdfa082d872c178a39f9c3dfb0240397a787a365e8629 WatchSource:0}: Error finding container d8d62083196957ce2cfcdfa082d872c178a39f9c3dfb0240397a787a365e8629: Status 404 returned error can't find the container with id d8d62083196957ce2cfcdfa082d872c178a39f9c3dfb0240397a787a365e8629
Feb 18 05:48:27 crc kubenswrapper[4707]: W0218 05:48:27.627129 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52fa1b0d_b1f0_4964_ba77_417233b07f60.slice/crio-ad7fc3e14118ca818b991440188e14563fd473bd68cc71997d2e15fa5ccd90fa WatchSource:0}: Error finding container ad7fc3e14118ca818b991440188e14563fd473bd68cc71997d2e15fa5ccd90fa: Status 404 returned error can't find the container with id ad7fc3e14118ca818b991440188e14563fd473bd68cc71997d2e15fa5ccd90fa
Feb 18 05:48:27 crc kubenswrapper[4707]: W0218 05:48:27.637965 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1888d8d_c40a_4ee1_a697_6f5e97bf657d.slice/crio-4c7509611f9daf477d25ec5ba99fffa9c06290616ff5b329dec15db03697421e WatchSource:0}: Error finding container 4c7509611f9daf477d25ec5ba99fffa9c06290616ff5b329dec15db03697421e: Status 404 returned error can't find the container with id 4c7509611f9daf477d25ec5ba99fffa9c06290616ff5b329dec15db03697421e
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.638670 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" podStartSLOduration=19.638654352 podStartE2EDuration="19.638654352s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.599259765 +0000 UTC m=+44.247218889" watchObservedRunningTime="2026-02-18 05:48:27.638654352 +0000 UTC m=+44.286613486"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.640116 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-75w4w" podStartSLOduration=19.64010733 podStartE2EDuration="19.64010733s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.638273292 +0000 UTC m=+44.286232426" watchObservedRunningTime="2026-02-18 05:48:27.64010733 +0000 UTC m=+44.288066464"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.643920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.644424 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:28.144395133 +0000 UTC m=+44.792354267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.682904 4707 csr.go:261] certificate signing request csr-zmg9z is approved, waiting to be issued
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.700956 4707 csr.go:257] certificate signing request csr-zmg9z is issued
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.726748 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-b87f8" podStartSLOduration=18.726720431 podStartE2EDuration="18.726720431s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.683060102 +0000 UTC m=+44.331019236" watchObservedRunningTime="2026-02-18 05:48:27.726720431 +0000 UTC m=+44.374679565"
Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.747532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.748052 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:28.248029262 +0000 UTC m=+44.895988396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.828960 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c95nl" podStartSLOduration=19.828939623 podStartE2EDuration="19.828939623s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:27.823445609 +0000 UTC m=+44.471404743" watchObservedRunningTime="2026-02-18 05:48:27.828939623 +0000 UTC m=+44.476898757" Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.850142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.850518 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:28.350503452 +0000 UTC m=+44.998462586 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.937853 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:48:27 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Feb 18 05:48:27 crc kubenswrapper[4707]: [+]process-running ok Feb 18 05:48:27 crc kubenswrapper[4707]: healthz check failed Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.938320 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:48:27 crc kubenswrapper[4707]: I0218 05:48:27.951813 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:27 crc kubenswrapper[4707]: E0218 05:48:27.952279 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:28.452248241 +0000 UTC m=+45.100207375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.054917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.055371 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:28.555351356 +0000 UTC m=+45.203310490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.104203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" event={"ID":"6bc0f806-810b-48df-8795-fb4962e906c1","Type":"ContainerStarted","Data":"59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.104456 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.109996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" event={"ID":"6c734b6c-4efa-4755-b79c-2eb9d132ebcb","Type":"ContainerStarted","Data":"d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.110059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" event={"ID":"6c734b6c-4efa-4755-b79c-2eb9d132ebcb","Type":"ContainerStarted","Data":"fe601dff3cc18848f96966d10aa65189d5c5474ff069dcc4d1aca9658334335c"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.111122 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.129109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" event={"ID":"c1888d8d-c40a-4ee1-a697-6f5e97bf657d","Type":"ContainerStarted","Data":"4c7509611f9daf477d25ec5ba99fffa9c06290616ff5b329dec15db03697421e"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.129266 4707 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-g8pt7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.129305 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" podUID="6c734b6c-4efa-4755-b79c-2eb9d132ebcb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.130858 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" podStartSLOduration=7.130847785 podStartE2EDuration="7.130847785s" podCreationTimestamp="2026-02-18 05:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.129297904 +0000 UTC m=+44.777257038" watchObservedRunningTime="2026-02-18 05:48:28.130847785 +0000 UTC m=+44.778806919" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.140077 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" event={"ID":"de733605-5332-483f-b086-f97251d14bab","Type":"ContainerStarted","Data":"cc8d078b87b247020bfd8dea913c8eb228f583d748a4e46e548f5cf6d8464efc"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.158385 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.159763 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:28.659747735 +0000 UTC m=+45.307706869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.171353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjq22" event={"ID":"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc","Type":"ContainerStarted","Data":"00afea1d71a4a181988be460c787ed76cfa753dd691cf53ab8a9c358fe31dc2c"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.187622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" event={"ID":"048d19f3-df93-4565-87ea-dd7ce3d9e888","Type":"ContainerStarted","Data":"643cb4f548d15f6e44431d79c9e8c54390c09f8c7d7e9206356f82db3fd298c9"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.218345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-8258q" event={"ID":"c4c25682-b294-4940-be7e-2f5eb7e7366d","Type":"ContainerStarted","Data":"74b1eef33bce0855938568bcba19fea36880ec625b8a84688f6ba0d0f4c7f34b"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.221585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" event={"ID":"15219006-aa83-4fc4-afed-79a5e7f18eda","Type":"ContainerStarted","Data":"77250a17edfa259c8653180b91ad41404e47f6c02a79f8504c2ac7d9c1e241fb"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.237023 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7b2sl" podStartSLOduration=19.237003321 podStartE2EDuration="19.237003321s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.231368382 +0000 UTC m=+44.879327516" watchObservedRunningTime="2026-02-18 05:48:28.237003321 +0000 UTC m=+44.884962455" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.240008 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" podStartSLOduration=19.239993009 podStartE2EDuration="19.239993009s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.204778222 +0000 UTC m=+44.852737356" watchObservedRunningTime="2026-02-18 05:48:28.239993009 +0000 UTC m=+44.887952143" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.248343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gnrhq" 
event={"ID":"6921a979-b8b5-4664-af9a-c56983db0020","Type":"ContainerStarted","Data":"93cd26dcbfae49f653389ae92eca81e478f6280ac7aaccf0b65ef8e46b1f17e2"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.259580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.260269 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:28.760246012 +0000 UTC m=+45.408205146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.261062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" event={"ID":"e09b1e8f-752e-42dc-a638-cc7ac7179f83","Type":"ContainerStarted","Data":"b640672c69127eb18e682c102bfc48075c6cda0d9c0141187fb4d44d2485dfea"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.261115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" 
event={"ID":"e09b1e8f-752e-42dc-a638-cc7ac7179f83","Type":"ContainerStarted","Data":"47c30edfe08274b00114598d159b11852f6da504abbf857f112c95a3231af770"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.275772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9" event={"ID":"abbe64a9-dc0c-4b1b-931e-dddd926a91c8","Type":"ContainerStarted","Data":"7b397368815b9565fa03dc9226f8ef414584a5c743998940085c11dd22b31c2e"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.278036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" event={"ID":"ccb59e4a-55d3-436a-94c1-e926834470fb","Type":"ContainerStarted","Data":"dee3382d3a00920f9707fc6d0e5c18dcf3889a79df6b9cc5e7657e0c295c2210"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.284031 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.284933 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gnrhq" podStartSLOduration=7.284916262 podStartE2EDuration="7.284916262s" podCreationTimestamp="2026-02-18 05:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.283818053 +0000 UTC m=+44.931777187" watchObservedRunningTime="2026-02-18 05:48:28.284916262 +0000 UTC m=+44.932875396" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.295147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" event={"ID":"34a7237c-6b9e-4036-8649-ab89d0ec1893","Type":"ContainerStarted","Data":"aa80d0931c525734d447e819fb2dd42ab4e99d1ff00436dcf0f8e60f49b990a8"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 
05:48:28.309159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" event={"ID":"fdb9f9f8-7162-479f-a789-dd3e61578ec4","Type":"ContainerStarted","Data":"9efe31b164f7a7c8c81daca169c77c74c20e2c2abbef58096f0c553c5839ca0e"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.312475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7f42d" event={"ID":"268d3c04-eb32-411a-8032-caa99ca62ade","Type":"ContainerStarted","Data":"a9e753e4f44b0d6daa86007300ed89be89c3c8e6dd39b48f6d7c618b68d0e777"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.313423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" event={"ID":"52fa1b0d-b1f0-4964-ba77-417233b07f60","Type":"ContainerStarted","Data":"ad7fc3e14118ca818b991440188e14563fd473bd68cc71997d2e15fa5ccd90fa"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.331554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" event={"ID":"ddb09044-1d2b-435d-a424-d22fbac84ca8","Type":"ContainerStarted","Data":"d8d62083196957ce2cfcdfa082d872c178a39f9c3dfb0240397a787a365e8629"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.344513 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tnwns" podStartSLOduration=19.34448449 podStartE2EDuration="19.34448449s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.343481165 +0000 UTC m=+44.991440299" watchObservedRunningTime="2026-02-18 05:48:28.34448449 +0000 UTC m=+44.992443624" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.361126 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.362884 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:28.862865824 +0000 UTC m=+45.510824948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.401478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" event={"ID":"e80f686a-6a1b-433b-b994-a10eec4758ed","Type":"ContainerStarted","Data":"8aeb1b4a2ef460a242bee3a36ee370dccf63c2afcc356b0c1e792f928988cf6a"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.418559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" event={"ID":"00e150c8-f0e4-4038-8213-cc395d64d48b","Type":"ContainerStarted","Data":"71602701a8268dfe3e529c0449f386382168b0748d2b93b177b587455bbf9e01"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.421365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" event={"ID":"0efe17cc-5e27-4a2f-80b2-1a4c7df15f7e","Type":"ContainerStarted","Data":"a66e34250e988472940091deb1690c96bdeb74b4583473f15d7da0d05676ebbe"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.428464 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-snscc" event={"ID":"95bdd5db-88ec-41b6-9752-b5646a64f1ae","Type":"ContainerStarted","Data":"02f2aeb6ae430fd629cd39e1afd7ce5ea961454963ceb618f73e178b8242382f"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.444967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" event={"ID":"48f33e8e-2190-4402-9a3d-b8f7f3324da4","Type":"ContainerStarted","Data":"f1befbeae4d403ba78864dacaca184cb0e6998937e0c34a55bf294610f37558e"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.455583 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" podStartSLOduration=20.455551476 podStartE2EDuration="20.455551476s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.445417788 +0000 UTC m=+45.093376922" watchObservedRunningTime="2026-02-18 05:48:28.455551476 +0000 UTC m=+45.103510610" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.456810 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f9dnq" event={"ID":"055d0ae1-3458-4a39-85ae-6880ec2bae14","Type":"ContainerStarted","Data":"b9eec272754b68424790356d652bbad160433528caebaddf30b1e3a649bb21ba"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.457484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-f9dnq" Feb 18 05:48:28 
crc kubenswrapper[4707]: I0218 05:48:28.468273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" event={"ID":"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2","Type":"ContainerStarted","Data":"0d15c4ff61e143c715622dbdb0ebe6e545d05e74a483efd18242e303482fbe3a"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.468338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" event={"ID":"8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2","Type":"ContainerStarted","Data":"d98c62864d61ab72c9ff48a1555dbbbd34362e37a0069c7908bcb0e85361d605"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.469074 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.469397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.471665 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:28.971640059 +0000 UTC m=+45.619599403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.481747 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qf9jw" podStartSLOduration=19.481720654 podStartE2EDuration="19.481720654s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.477302138 +0000 UTC m=+45.125261272" watchObservedRunningTime="2026-02-18 05:48:28.481720654 +0000 UTC m=+45.129679808" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.493648 4707 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pvdd9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.493702 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" podUID="8bbf7b7a-f1c2-4840-9d46-fbdeb7a94ff2" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.493768 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-f9dnq container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.493781 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f9dnq" podUID="055d0ae1-3458-4a39-85ae-6880ec2bae14" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.527122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" event={"ID":"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3","Type":"ContainerStarted","Data":"152da3fce42cace061af962ab7508f4de49febfbe493a4890bd2d027596e147d"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.551096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6jdbh" event={"ID":"bb2eb67f-2587-4991-93cd-aeb2eed647e3","Type":"ContainerStarted","Data":"e4bf8c6225c18931219addaaecb7dd2cd0cd997c73cf33aab7ae4c54990f6287"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.572743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.574762 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:29.074735225 +0000 UTC m=+45.722694359 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.581134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" event={"ID":"496fadfa-36c2-47ff-a5b1-a485de9df869","Type":"ContainerStarted","Data":"018829fce589db0a674c32e143bf51c970917e8151759d7700e60f1200d1f19c"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.620482 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-f9dnq" podStartSLOduration=20.620457408 podStartE2EDuration="20.620457408s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.571186031 +0000 UTC m=+45.219145165" watchObservedRunningTime="2026-02-18 05:48:28.620457408 +0000 UTC m=+45.268416542" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.650594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" event={"ID":"384cd184-04e6-4505-b0f3-a2e367bb6dcd","Type":"ContainerStarted","Data":"42f8186c2df3110a1a5c9a0c3388fdc6d30c069e3bfb08f9002b37699ea5be1f"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.680912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.681356 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:29.181339532 +0000 UTC m=+45.829298666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.683264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" event={"ID":"e78651c6-6845-4f2a-a2dc-16590108a302","Type":"ContainerStarted","Data":"25a8513166e2cedf5e24a820e9f75027ce52b50abcf952f2b1e336b3fc4f403a"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.694607 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" podStartSLOduration=19.69458391 podStartE2EDuration="19.69458391s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.620671045 +0000 UTC m=+45.268630179" watchObservedRunningTime="2026-02-18 05:48:28.69458391 +0000 UTC m=+45.342543044" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.695734 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" podStartSLOduration=19.695729121 podStartE2EDuration="19.695729121s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.695061294 +0000 UTC m=+45.343020428" watchObservedRunningTime="2026-02-18 05:48:28.695729121 +0000 UTC m=+45.343688255" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.706069 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-18 05:43:27 +0000 UTC, rotation deadline is 2026-11-20 23:55:15.947846975 +0000 UTC Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.706148 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6618h6m47.241701499s for next certificate rotation Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.715465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m2l7m" event={"ID":"a8657192-49a2-4c45-bc94-bbc3e2e608af","Type":"ContainerStarted","Data":"1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.771001 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" podStartSLOduration=19.770977353 podStartE2EDuration="19.770977353s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.769572316 +0000 UTC m=+45.417531450" watchObservedRunningTime="2026-02-18 05:48:28.770977353 +0000 UTC m=+45.418936487" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.777571 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="76d59c18-7131-4e63-bb20-3ae0e2ac8edb" containerID="2f1dfc68f351161240d11a8a848cc9765c23674a7aec3a1cd20c767142d849e8" exitCode=0 Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.779248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" event={"ID":"76d59c18-7131-4e63-bb20-3ae0e2ac8edb","Type":"ContainerDied","Data":"2f1dfc68f351161240d11a8a848cc9765c23674a7aec3a1cd20c767142d849e8"} Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.783115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.791893 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:29.291861223 +0000 UTC m=+45.939820357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.825244 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-m2l7m" podStartSLOduration=20.825224662 podStartE2EDuration="20.825224662s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:28.824566844 +0000 UTC m=+45.472525978" watchObservedRunningTime="2026-02-18 05:48:28.825224662 +0000 UTC m=+45.473183796" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.888811 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.895185 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:29.395160613 +0000 UTC m=+46.043119747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.919277 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-879zs"] Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.939129 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:48:28 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Feb 18 05:48:28 crc kubenswrapper[4707]: [+]process-running ok Feb 18 05:48:28 crc kubenswrapper[4707]: healthz check failed Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.939188 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:48:28 crc kubenswrapper[4707]: I0218 05:48:28.990343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:28 crc kubenswrapper[4707]: E0218 05:48:28.990854 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:29.490839083 +0000 UTC m=+46.138798217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.092305 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:29 crc kubenswrapper[4707]: E0218 05:48:29.092863 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:29.592831809 +0000 UTC m=+46.240791093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.117199 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sv9z4" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.196443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:29 crc kubenswrapper[4707]: E0218 05:48:29.197039 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:29.697014712 +0000 UTC m=+46.344973846 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.297809 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:29 crc kubenswrapper[4707]: E0218 05:48:29.298300 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:29.79828446 +0000 UTC m=+46.446243594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.338539 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.338888 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.400535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:29 crc kubenswrapper[4707]: E0218 05:48:29.400951 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:29.900921683 +0000 UTC m=+46.548880817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.435842 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.436292 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.504184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:29 crc kubenswrapper[4707]: E0218 05:48:29.504597 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.004585093 +0000 UTC m=+46.652544227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.610590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:29 crc kubenswrapper[4707]: E0218 05:48:29.611101 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.111080037 +0000 UTC m=+46.759039171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.712222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:29 crc kubenswrapper[4707]: E0218 05:48:29.712841 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.212829867 +0000 UTC m=+46.860789001 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.786899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" event={"ID":"fdb9f9f8-7162-479f-a789-dd3e61578ec4","Type":"ContainerStarted","Data":"68b3e2553bd0d758fbbe9e1154663118b0672dd6676e9b1c9eec82512a86a360"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.811844 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" event={"ID":"048d19f3-df93-4565-87ea-dd7ce3d9e888","Type":"ContainerStarted","Data":"2cfcc79e5e3c97ea215b37081a529b80ec89b5dddb3ce28d5e1fa609339ab542"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.816554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:29 crc kubenswrapper[4707]: E0218 05:48:29.816999 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.3169823 +0000 UTC m=+46.964941434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.836599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" event={"ID":"ccb59e4a-55d3-436a-94c1-e926834470fb","Type":"ContainerStarted","Data":"185508fae14eb9ca3813aa09b512c713c44b840b0aee32bd6647a5a40084bbc8"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.849456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7f42d" event={"ID":"268d3c04-eb32-411a-8032-caa99ca62ade","Type":"ContainerStarted","Data":"0b779bc9648b870856aae099ea872d535af169a67fbfc5026ee501a00c4a25f6"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.849770 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7f42d" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.850690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" event={"ID":"15219006-aa83-4fc4-afed-79a5e7f18eda","Type":"ContainerStarted","Data":"2b0d19d217d9053d846a5c74b356c2151df128c9b829ac37756189b0333d1bce"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.852136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" event={"ID":"ddb09044-1d2b-435d-a424-d22fbac84ca8","Type":"ContainerStarted","Data":"2dbcd88a8f9551f79a4b7c9fdade9b6ae933bdab495ff30243b6db4952122727"} 
Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.866636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" event={"ID":"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3","Type":"ContainerStarted","Data":"8738ba56ce4c66b37a375388231ca3a9ec8185e591ffdd82dcb6d5d2f53d21fc"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.882819 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vx8tz" podStartSLOduration=20.882781823 podStartE2EDuration="20.882781823s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:29.881822637 +0000 UTC m=+46.529781771" watchObservedRunningTime="2026-02-18 05:48:29.882781823 +0000 UTC m=+46.530740957" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.883577 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" podStartSLOduration=21.883571694 podStartE2EDuration="21.883571694s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:29.831956124 +0000 UTC m=+46.479915258" watchObservedRunningTime="2026-02-18 05:48:29.883571694 +0000 UTC m=+46.531530828" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.890853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gnrhq" event={"ID":"6921a979-b8b5-4664-af9a-c56983db0020","Type":"ContainerStarted","Data":"5daeacb1d8e36ea95cd54c4985241a74a2dc8044286d9dae717a0b59d076d129"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.904989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" event={"ID":"34a7237c-6b9e-4036-8649-ab89d0ec1893","Type":"ContainerStarted","Data":"b2fe0c3690b130b2b185137b565d157da0b19cee5c06b1328eef1c39a99b7243"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.905937 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.919635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:29 crc kubenswrapper[4707]: E0218 05:48:29.919944 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.419932332 +0000 UTC m=+47.067891466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.929841 4707 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4sp72 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.929896 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" podUID="34a7237c-6b9e-4036-8649-ab89d0ec1893" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.936770 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:48:29 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Feb 18 05:48:29 crc kubenswrapper[4707]: [+]process-running ok Feb 18 05:48:29 crc kubenswrapper[4707]: healthz check failed Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.936831 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.936923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" event={"ID":"c1888d8d-c40a-4ee1-a697-6f5e97bf657d","Type":"ContainerStarted","Data":"209c30f38452107f7dee8bb4f8f7b536af114588ce6a1b1a46ae2d558834cf64"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.937462 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fp5dm" podStartSLOduration=20.937454282 podStartE2EDuration="20.937454282s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:29.937256327 +0000 UTC m=+46.585215461" watchObservedRunningTime="2026-02-18 05:48:29.937454282 +0000 UTC m=+46.585413416" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.959164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tp7zn" event={"ID":"e78651c6-6845-4f2a-a2dc-16590108a302","Type":"ContainerStarted","Data":"cb377cd765f8c631bf20581e396fcc8a2b2bf89dc28891c4ce65f51dfc753e38"} Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.974832 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" podStartSLOduration=20.974815647 podStartE2EDuration="20.974815647s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:29.97304896 +0000 UTC m=+46.621008114" watchObservedRunningTime="2026-02-18 05:48:29.974815647 
+0000 UTC m=+46.622774781" Feb 18 05:48:29 crc kubenswrapper[4707]: I0218 05:48:29.978288 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" event={"ID":"52fa1b0d-b1f0-4964-ba77-417233b07f60","Type":"ContainerStarted","Data":"350f0bb69a35b0b32678a5b543c2dd654c1d5655f73773dfddc8275d851491d1"} Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.005265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" event={"ID":"496fadfa-36c2-47ff-a5b1-a485de9df869","Type":"ContainerStarted","Data":"f7ea9f280f1adeaa3f11d36ab83261d89cd7a7bc87b91e2064c48a2e62d2ef94"} Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.006464 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.013285 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7f42d" podStartSLOduration=9.013271939 podStartE2EDuration="9.013271939s" podCreationTimestamp="2026-02-18 05:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:30.012032636 +0000 UTC m=+46.659991770" watchObservedRunningTime="2026-02-18 05:48:30.013271939 +0000 UTC m=+46.661231073" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.022075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.023344 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.523325974 +0000 UTC m=+47.171285108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.040103 4707 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-j954g container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.040159 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" podUID="496fadfa-36c2-47ff-a5b1-a485de9df869" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.040415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" event={"ID":"48f33e8e-2190-4402-9a3d-b8f7f3324da4","Type":"ContainerStarted","Data":"f971cd685c28d823a0817d921dea93b4864dc9624495491524e5f14189e3e09f"} Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.064453 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72" podStartSLOduration=21.064435407 podStartE2EDuration="21.064435407s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:30.064418796 +0000 UTC m=+46.712377930" watchObservedRunningTime="2026-02-18 05:48:30.064435407 +0000 UTC m=+46.712394541" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.082770 4707 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-g8pt7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.094423 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" podUID="6c734b6c-4efa-4755-b79c-2eb9d132ebcb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.088191 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-f9dnq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.094514 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f9dnq" podUID="055d0ae1-3458-4a39-85ae-6880ec2bae14" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.101391 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9" event={"ID":"abbe64a9-dc0c-4b1b-931e-dddd926a91c8","Type":"ContainerStarted","Data":"9a6d69d63e1071c910e150fceb4824639c85f7320b4af0541d66159e232df559"} Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.102270 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvdd9" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.128474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.132246 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.632235282 +0000 UTC m=+47.280194416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.182537 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g" podStartSLOduration=21.182512586 podStartE2EDuration="21.182512586s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:30.166663289 +0000 UTC m=+46.814622423" watchObservedRunningTime="2026-02-18 05:48:30.182512586 +0000 UTC m=+46.830471720" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.234947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.235760 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.735744028 +0000 UTC m=+47.383703152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.243402 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.263119 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l2jp2" podStartSLOduration=21.263100879 podStartE2EDuration="21.263100879s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:30.260270484 +0000 UTC m=+46.908229608" watchObservedRunningTime="2026-02-18 05:48:30.263100879 +0000 UTC m=+46.911060013" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.337496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.337842 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:30.837830537 +0000 UTC m=+47.485789671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.351906 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m7vfl" podStartSLOduration=21.351884807 podStartE2EDuration="21.351884807s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:30.351380234 +0000 UTC m=+46.999339378" watchObservedRunningTime="2026-02-18 05:48:30.351884807 +0000 UTC m=+46.999843941" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.369174 4707 patch_prober.go:28] interesting pod/apiserver-76f77b778f-k2dhh container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 18 05:48:30 crc kubenswrapper[4707]: [+]log ok Feb 18 05:48:30 crc kubenswrapper[4707]: [+]etcd ok Feb 18 05:48:30 crc kubenswrapper[4707]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 18 05:48:30 crc kubenswrapper[4707]: [+]poststarthook/generic-apiserver-start-informers ok Feb 18 05:48:30 crc kubenswrapper[4707]: [+]poststarthook/max-in-flight-filter ok Feb 18 05:48:30 crc kubenswrapper[4707]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 18 05:48:30 crc kubenswrapper[4707]: 
[+]poststarthook/image.openshift.io-apiserver-caches ok Feb 18 05:48:30 crc kubenswrapper[4707]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 18 05:48:30 crc kubenswrapper[4707]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 18 05:48:30 crc kubenswrapper[4707]: [+]poststarthook/project.openshift.io-projectcache ok Feb 18 05:48:30 crc kubenswrapper[4707]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 18 05:48:30 crc kubenswrapper[4707]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Feb 18 05:48:30 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 18 05:48:30 crc kubenswrapper[4707]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 18 05:48:30 crc kubenswrapper[4707]: livez check failed Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.369256 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" podUID="e80f686a-6a1b-433b-b994-a10eec4758ed" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.443505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.443828 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.943782017 +0000 UTC m=+47.591741151 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.444004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.444362 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:30.944353732 +0000 UTC m=+47.592312856 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.510428 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9" podStartSLOduration=21.510414762 podStartE2EDuration="21.510414762s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:30.507974588 +0000 UTC m=+47.155933722" watchObservedRunningTime="2026-02-18 05:48:30.510414762 +0000 UTC m=+47.158373896" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.547310 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.547948 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.04793248 +0000 UTC m=+47.695891614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.649654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.650176 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.150158762 +0000 UTC m=+47.798117896 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.750650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.750887 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.250857604 +0000 UTC m=+47.898816738 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.751023 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.751446 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.251428049 +0000 UTC m=+47.899387183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.851925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.852132 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.35210171 +0000 UTC m=+48.000060844 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.936993 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:48:30 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Feb 18 05:48:30 crc kubenswrapper[4707]: [+]process-running ok Feb 18 05:48:30 crc kubenswrapper[4707]: healthz check failed Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.937073 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:48:30 crc kubenswrapper[4707]: I0218 05:48:30.953323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:30 crc kubenswrapper[4707]: E0218 05:48:30.953724 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 05:48:31.453704916 +0000 UTC m=+48.101664050 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.053980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.054141 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.554117891 +0000 UTC m=+48.202077025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.054198 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.054547 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.554539672 +0000 UTC m=+48.202498806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.094191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" event={"ID":"ccb59e4a-55d3-436a-94c1-e926834470fb","Type":"ContainerStarted","Data":"959b6a43c72449926ae64f0a825f3097e4ee0a1b50669610753f203eb4fc9a15"} Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.102229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjq22" event={"ID":"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc","Type":"ContainerStarted","Data":"1f722c88beda3f4315d6e35b9ea445b5a4c16f9355ee62c58e4c25f10e98fae2"} Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.102267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjq22" event={"ID":"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc","Type":"ContainerStarted","Data":"ad3a95b957902bb0e3f35a21690eff9fe11c92d59ad31fe37431e8a7ac1a1116"} Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.113209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" event={"ID":"ddb09044-1d2b-435d-a424-d22fbac84ca8","Type":"ContainerStarted","Data":"0efcff8c8bc07cbcb9b843284a6034faafe1bb5b3c4be1eb5cab60133a512186"} Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.116378 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-snscc" 
event={"ID":"95bdd5db-88ec-41b6-9752-b5646a64f1ae","Type":"ContainerStarted","Data":"a11b6b0a4a2d514fca8fce3dda8cc98c86c82f94cbceab25076d3aa601425fca"} Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.125266 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hgb9s" podStartSLOduration=22.125250103 podStartE2EDuration="22.125250103s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:31.124367041 +0000 UTC m=+47.772326175" watchObservedRunningTime="2026-02-18 05:48:31.125250103 +0000 UTC m=+47.773209237" Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.131528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5sgn9" event={"ID":"abbe64a9-dc0c-4b1b-931e-dddd926a91c8","Type":"ContainerStarted","Data":"877fddab327827ceae9b0ced7e97e14685ccd18568f55ac1094418f019c9d88c"} Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.140397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" event={"ID":"52fa1b0d-b1f0-4964-ba77-417233b07f60","Type":"ContainerStarted","Data":"e290f1266406561ba68319992df221fb2eb6901bda44590c36356e0552900bc7"} Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.140561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.143376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" event={"ID":"76d59c18-7131-4e63-bb20-3ae0e2ac8edb","Type":"ContainerStarted","Data":"34c98053591274909fec4b929e62dee41af14c9bf934c87583bbffad0e7cbef0"} 
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.143460 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.149461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" event={"ID":"00e150c8-f0e4-4038-8213-cc395d64d48b","Type":"ContainerStarted","Data":"a7840fa18dfcfeb3a9e6d458a518977da3b7489974cd5b92a1a27811188f69d7"}
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.158094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8258q" event={"ID":"c4c25682-b294-4940-be7e-2f5eb7e7366d","Type":"ContainerStarted","Data":"c55de1ecb0571b12e986567f05aa1f10bae85c1ef42922558396b4655d44bf86"}
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.158389 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-snscc" podStartSLOduration=22.158374736 podStartE2EDuration="22.158374736s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:31.154393611 +0000 UTC m=+47.802352745" watchObservedRunningTime="2026-02-18 05:48:31.158374736 +0000 UTC m=+47.806333870"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.158869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.160943 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.660925604 +0000 UTC m=+48.308884738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.179242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77bg6" event={"ID":"abd6a4c0-f7fb-467a-aa7d-c47cdaef68c3","Type":"ContainerStarted","Data":"0caf055d37a004936162cbd164ed28e0bb58e6e9ab1db7fa48ac5bee80fc1b46"}
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.184213 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" podUID="6bc0f806-810b-48df-8795-fb4962e906c1" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" gracePeriod=30
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.189049 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-f9dnq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.189102 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f9dnq" podUID="055d0ae1-3458-4a39-85ae-6880ec2bae14" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.193202 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.204004 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lhz6t"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.225308 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4sp72"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.245835 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhw9r" podStartSLOduration=22.245810899 podStartE2EDuration="22.245810899s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:31.209676758 +0000 UTC m=+47.857635892" watchObservedRunningTime="2026-02-18 05:48:31.245810899 +0000 UTC m=+47.893770033"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.273635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.278873 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.778859639 +0000 UTC m=+48.426818773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.296097 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8258q" podStartSLOduration=22.296077132 podStartE2EDuration="22.296077132s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:31.24395831 +0000 UTC m=+47.891917444" watchObservedRunningTime="2026-02-18 05:48:31.296077132 +0000 UTC m=+47.944036266"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.357265 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7qhxj" podStartSLOduration=22.357240053 podStartE2EDuration="22.357240053s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:31.351261726 +0000 UTC m=+47.999220850" watchObservedRunningTime="2026-02-18 05:48:31.357240053 +0000 UTC m=+48.005199187"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.374576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.374759 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.874731795 +0000 UTC m=+48.522690929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.374880 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.375248 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.875235167 +0000 UTC m=+48.523194301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.387600 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" podStartSLOduration=23.387583493 podStartE2EDuration="23.387583493s" podCreationTimestamp="2026-02-18 05:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:31.386267708 +0000 UTC m=+48.034226852" watchObservedRunningTime="2026-02-18 05:48:31.387583493 +0000 UTC m=+48.035542627"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.419567 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" podStartSLOduration=22.419551185 podStartE2EDuration="22.419551185s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:31.417676575 +0000 UTC m=+48.065635709" watchObservedRunningTime="2026-02-18 05:48:31.419551185 +0000 UTC m=+48.067510319"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.476067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.476996 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:31.976979387 +0000 UTC m=+48.624938521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.578254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.578786 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:32.078764997 +0000 UTC m=+48.726724131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.629123 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j954g"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.678668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.679040 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:32.179023418 +0000 UTC m=+48.826982552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.745034 4707 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.779950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.780364 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:32.280347576 +0000 UTC m=+48.928306700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.880950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.881444 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:32.381408647 +0000 UTC m=+49.029367781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.933629 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 05:48:31 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld
Feb 18 05:48:31 crc kubenswrapper[4707]: [+]process-running ok
Feb 18 05:48:31 crc kubenswrapper[4707]: healthz check failed
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.933691 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.983241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:31 crc kubenswrapper[4707]: E0218 05:48:31.983566 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:32.483553387 +0000 UTC m=+49.131512521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.997493 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 18 05:48:31 crc kubenswrapper[4707]: I0218 05:48:31.998324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.001717 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.003173 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.011774 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.083997 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:32 crc kubenswrapper[4707]: E0218 05:48:32.084078 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:32.584062055 +0000 UTC m=+49.232021189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.084502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.084738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.084829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:32 crc kubenswrapper[4707]: E0218 05:48:32.085325 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:32.585306728 +0000 UTC m=+49.233265862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.185336 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.185365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" event={"ID":"fdb9f9f8-7162-479f-a789-dd3e61578ec4","Type":"ContainerDied","Data":"68b3e2553bd0d758fbbe9e1154663118b0672dd6676e9b1c9eec82512a86a360"}
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.185327 4707 generic.go:334] "Generic (PLEG): container finished" podID="fdb9f9f8-7162-479f-a789-dd3e61578ec4" containerID="68b3e2553bd0d758fbbe9e1154663118b0672dd6676e9b1c9eec82512a86a360" exitCode=0
Feb 18 05:48:32 crc kubenswrapper[4707]: E0218 05:48:32.185468 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 05:48:32.685444355 +0000 UTC m=+49.333403489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.185655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.185716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.185932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.185979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:48:32 crc kubenswrapper[4707]: E0218 05:48:32.186125 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 05:48:32.686113272 +0000 UTC m=+49.334072406 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r74s4" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.188648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjq22" event={"ID":"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc","Type":"ContainerStarted","Data":"53097fc9f18b23025231e7db953802f7086ceb6d953ac3c7b48747788d244647"}
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.188715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjq22" event={"ID":"fad3b2dc-a4ef-4a58-b382-993b82f4fcbc","Type":"ContainerStarted","Data":"59f08cdb7d78db2d67be4ef33f5c5cd8221541e8a93c8824113cfc9817418958"}
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.205504 4707 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-18T05:48:31.745063947Z","Handler":null,"Name":""}
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.212061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.229203 4707 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.229244 4707 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.248627 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kjq22" podStartSLOduration=11.248609279 podStartE2EDuration="11.248609279s" podCreationTimestamp="2026-02-18 05:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:32.244697315 +0000 UTC m=+48.892656769" watchObservedRunningTime="2026-02-18 05:48:32.248609279 +0000 UTC m=+48.896568413"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.287705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.292282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.302857 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.303015 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.313034 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.329086 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.393264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.402205 4707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.402265 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.546987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r74s4\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.669928 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5pvbq"]
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.671026 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.677708 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.685975 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5pvbq"]
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.710435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-utilities\") pod \"community-operators-5pvbq\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.710538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-catalog-content\") pod \"community-operators-5pvbq\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.710586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6bxv\" (UniqueName: \"kubernetes.io/projected/37ed8460-3a60-4ec0-b074-69244d0a46cf-kube-api-access-q6bxv\") pod \"community-operators-5pvbq\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.721111 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.723301 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.808939 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-prbjm"]
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.810056 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-prbjm"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.811640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-utilities\") pod \"community-operators-5pvbq\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.811742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-catalog-content\") pod \"community-operators-5pvbq\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.811809 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6bxv\" (UniqueName: \"kubernetes.io/projected/37ed8460-3a60-4ec0-b074-69244d0a46cf-kube-api-access-q6bxv\") pod \"community-operators-5pvbq\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.812183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-utilities\") pod \"community-operators-5pvbq\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.812699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-catalog-content\") pod \"community-operators-5pvbq\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.813279 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.838511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6bxv\" (UniqueName: \"kubernetes.io/projected/37ed8460-3a60-4ec0-b074-69244d0a46cf-kube-api-access-q6bxv\") pod \"community-operators-5pvbq\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " pod="openshift-marketplace/community-operators-5pvbq"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.843502 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-prbjm"]
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.914497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-catalog-content\") pod \"certified-operators-prbjm\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " pod="openshift-marketplace/certified-operators-prbjm"
Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.915122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtqqv\" (UniqueName: \"kubernetes.io/projected/065bc74d-6afe-4b4b-83a6-494643b467d7-kube-api-access-xtqqv\") pod \"certified-operators-prbjm\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") "
pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.915171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-utilities\") pod \"certified-operators-prbjm\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.937169 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:48:32 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Feb 18 05:48:32 crc kubenswrapper[4707]: [+]process-running ok Feb 18 05:48:32 crc kubenswrapper[4707]: healthz check failed Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.937241 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.962628 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r74s4"] Feb 18 05:48:32 crc kubenswrapper[4707]: W0218 05:48:32.975965 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2d5877c_6bdc_4837_a2e9_1f1a751e4e2d.slice/crio-47e499d55fc1e344e023fdfd1c4cfe13fd68d5355d82f02ae80ddc871179d799 WatchSource:0}: Error finding container 47e499d55fc1e344e023fdfd1c4cfe13fd68d5355d82f02ae80ddc871179d799: Status 404 returned error can't find the container with id 
47e499d55fc1e344e023fdfd1c4cfe13fd68d5355d82f02ae80ddc871179d799 Feb 18 05:48:32 crc kubenswrapper[4707]: I0218 05:48:32.996737 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5pvbq" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.015028 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m9tn5"] Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.016055 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.017300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-utilities\") pod \"certified-operators-prbjm\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.017371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-catalog-content\") pod \"certified-operators-prbjm\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.017420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtqqv\" (UniqueName: \"kubernetes.io/projected/065bc74d-6afe-4b4b-83a6-494643b467d7-kube-api-access-xtqqv\") pod \"certified-operators-prbjm\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.018085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-utilities\") pod \"certified-operators-prbjm\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.018094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-catalog-content\") pod \"certified-operators-prbjm\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.036945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9tn5"] Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.041382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtqqv\" (UniqueName: \"kubernetes.io/projected/065bc74d-6afe-4b4b-83a6-494643b467d7-kube-api-access-xtqqv\") pod \"certified-operators-prbjm\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.167208 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.211164 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pwkrp"] Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.212899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" event={"ID":"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d","Type":"ContainerStarted","Data":"ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148"} Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.212944 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.212958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" event={"ID":"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d","Type":"ContainerStarted","Data":"47e499d55fc1e344e023fdfd1c4cfe13fd68d5355d82f02ae80ddc871179d799"} Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.213060 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.216614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6a9780f2-7a17-4f4c-9e5d-96c929cd7760","Type":"ContainerStarted","Data":"2130033955f17db7d80d13c8fc2af8ede2fd695fa8fdefa5c56a617a846e7240"} Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.216648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6a9780f2-7a17-4f4c-9e5d-96c929cd7760","Type":"ContainerStarted","Data":"414ff6ddfeda2435634900387bdd4ed5c9d1e3142e375b2ed51604ee77b5793f"} Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.221079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-catalog-content\") pod \"community-operators-m9tn5\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.221239 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-utilities\") pod \"community-operators-m9tn5\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.221287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8jfx\" (UniqueName: \"kubernetes.io/projected/78c0afb9-5c9c-48b6-8d4e-458b99f37300-kube-api-access-l8jfx\") pod \"community-operators-m9tn5\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " pod="openshift-marketplace/community-operators-m9tn5" Feb 18 
05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.231559 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pwkrp"] Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.238806 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" podStartSLOduration=24.238755354 podStartE2EDuration="24.238755354s" podCreationTimestamp="2026-02-18 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:33.234945084 +0000 UTC m=+49.882904218" watchObservedRunningTime="2026-02-18 05:48:33.238755354 +0000 UTC m=+49.886714488" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.312902 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.312873156 podStartE2EDuration="2.312873156s" podCreationTimestamp="2026-02-18 05:48:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:33.282832205 +0000 UTC m=+49.930791339" watchObservedRunningTime="2026-02-18 05:48:33.312873156 +0000 UTC m=+49.960832290" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.323212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjknc\" (UniqueName: \"kubernetes.io/projected/09b6ba7d-96fc-469f-8b02-f57b06081e67-kube-api-access-mjknc\") pod \"certified-operators-pwkrp\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.323321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-utilities\") pod \"community-operators-m9tn5\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.323384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8jfx\" (UniqueName: \"kubernetes.io/projected/78c0afb9-5c9c-48b6-8d4e-458b99f37300-kube-api-access-l8jfx\") pod \"community-operators-m9tn5\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.323417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-utilities\") pod \"certified-operators-pwkrp\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.323514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-catalog-content\") pod \"community-operators-m9tn5\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.323589 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-catalog-content\") pod \"certified-operators-pwkrp\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.326236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-catalog-content\") pod \"community-operators-m9tn5\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.327259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-utilities\") pod \"community-operators-m9tn5\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.352259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8jfx\" (UniqueName: \"kubernetes.io/projected/78c0afb9-5c9c-48b6-8d4e-458b99f37300-kube-api-access-l8jfx\") pod \"community-operators-m9tn5\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.355393 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5pvbq"] Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.358891 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.424978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-utilities\") pod \"certified-operators-pwkrp\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.425096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-catalog-content\") pod \"certified-operators-pwkrp\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.425174 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjknc\" (UniqueName: \"kubernetes.io/projected/09b6ba7d-96fc-469f-8b02-f57b06081e67-kube-api-access-mjknc\") pod \"certified-operators-pwkrp\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.425676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-utilities\") pod \"certified-operators-pwkrp\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.426074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-catalog-content\") pod \"certified-operators-pwkrp\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " 
pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.434031 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.449903 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjknc\" (UniqueName: \"kubernetes.io/projected/09b6ba7d-96fc-469f-8b02-f57b06081e67-kube-api-access-mjknc\") pod \"certified-operators-pwkrp\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.450187 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.525120 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-prbjm"] Feb 18 05:48:33 crc kubenswrapper[4707]: W0218 05:48:33.550778 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065bc74d_6afe_4b4b_83a6_494643b467d7.slice/crio-0ecd4e337220e93fb4702b33042090c4f6ea010e6818fe6079f71df2354bd693 WatchSource:0}: Error finding container 0ecd4e337220e93fb4702b33042090c4f6ea010e6818fe6079f71df2354bd693: Status 404 returned error can't find the container with id 0ecd4e337220e93fb4702b33042090c4f6ea010e6818fe6079f71df2354bd693 Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.551614 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.743193 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9tn5"] Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.866888 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wd4tk" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.873305 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.919777 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pwkrp"] Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.940968 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:48:33 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Feb 18 05:48:33 crc kubenswrapper[4707]: [+]process-running ok Feb 18 05:48:33 crc kubenswrapper[4707]: healthz check failed Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.941016 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:48:33 crc kubenswrapper[4707]: I0218 05:48:33.950413 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.950398746 podStartE2EDuration="950.398746ms" podCreationTimestamp="2026-02-18 05:48:33 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:33.919943684 +0000 UTC m=+50.567902818" watchObservedRunningTime="2026-02-18 05:48:33.950398746 +0000 UTC m=+50.598357880" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.040038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4bm6\" (UniqueName: \"kubernetes.io/projected/fdb9f9f8-7162-479f-a789-dd3e61578ec4-kube-api-access-h4bm6\") pod \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.040323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdb9f9f8-7162-479f-a789-dd3e61578ec4-config-volume\") pod \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.041158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb9f9f8-7162-479f-a789-dd3e61578ec4-config-volume" (OuterVolumeSpecName: "config-volume") pod "fdb9f9f8-7162-479f-a789-dd3e61578ec4" (UID: "fdb9f9f8-7162-479f-a789-dd3e61578ec4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.041274 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdb9f9f8-7162-479f-a789-dd3e61578ec4-secret-volume\") pod \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\" (UID: \"fdb9f9f8-7162-479f-a789-dd3e61578ec4\") " Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.042425 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdb9f9f8-7162-479f-a789-dd3e61578ec4-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.049691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb9f9f8-7162-479f-a789-dd3e61578ec4-kube-api-access-h4bm6" (OuterVolumeSpecName: "kube-api-access-h4bm6") pod "fdb9f9f8-7162-479f-a789-dd3e61578ec4" (UID: "fdb9f9f8-7162-479f-a789-dd3e61578ec4"). InnerVolumeSpecName "kube-api-access-h4bm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.055064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb9f9f8-7162-479f-a789-dd3e61578ec4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fdb9f9f8-7162-479f-a789-dd3e61578ec4" (UID: "fdb9f9f8-7162-479f-a789-dd3e61578ec4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.069304 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.143316 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4bm6\" (UniqueName: \"kubernetes.io/projected/fdb9f9f8-7162-479f-a789-dd3e61578ec4-kube-api-access-h4bm6\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.143348 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fdb9f9f8-7162-479f-a789-dd3e61578ec4-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.230558 4707 generic.go:334] "Generic (PLEG): container finished" podID="6a9780f2-7a17-4f4c-9e5d-96c929cd7760" containerID="2130033955f17db7d80d13c8fc2af8ede2fd695fa8fdefa5c56a617a846e7240" exitCode=0 Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.230926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6a9780f2-7a17-4f4c-9e5d-96c929cd7760","Type":"ContainerDied","Data":"2130033955f17db7d80d13c8fc2af8ede2fd695fa8fdefa5c56a617a846e7240"} Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.233361 4707 generic.go:334] "Generic (PLEG): container finished" podID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerID="9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90" exitCode=0 Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.233447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbjm" 
event={"ID":"065bc74d-6afe-4b4b-83a6-494643b467d7","Type":"ContainerDied","Data":"9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90"} Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.233481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbjm" event={"ID":"065bc74d-6afe-4b4b-83a6-494643b467d7","Type":"ContainerStarted","Data":"0ecd4e337220e93fb4702b33042090c4f6ea010e6818fe6079f71df2354bd693"} Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.235999 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.237122 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.238452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl" event={"ID":"fdb9f9f8-7162-479f-a789-dd3e61578ec4","Type":"ContainerDied","Data":"9efe31b164f7a7c8c81daca169c77c74c20e2c2abbef58096f0c553c5839ca0e"} Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.238525 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9efe31b164f7a7c8c81daca169c77c74c20e2c2abbef58096f0c553c5839ca0e" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.245823 4707 generic.go:334] "Generic (PLEG): container finished" podID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerID="087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1" exitCode=0 Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.245958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9tn5" event={"ID":"78c0afb9-5c9c-48b6-8d4e-458b99f37300","Type":"ContainerDied","Data":"087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1"} Feb 
18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.245990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9tn5" event={"ID":"78c0afb9-5c9c-48b6-8d4e-458b99f37300","Type":"ContainerStarted","Data":"295bd1cf47e3ae6877c578712053718bd25f13890a0056e8ae8ea414b440a674"} Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.259608 4707 generic.go:334] "Generic (PLEG): container finished" podID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerID="a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d" exitCode=0 Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.259728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwkrp" event={"ID":"09b6ba7d-96fc-469f-8b02-f57b06081e67","Type":"ContainerDied","Data":"a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d"} Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.259769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwkrp" event={"ID":"09b6ba7d-96fc-469f-8b02-f57b06081e67","Type":"ContainerStarted","Data":"85bca17c3bd545a41382a6df72700cee7647e8cb7c34efc1621d9689e192f015"} Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.276087 4707 generic.go:334] "Generic (PLEG): container finished" podID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerID="22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123" exitCode=0 Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.276522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pvbq" event={"ID":"37ed8460-3a60-4ec0-b074-69244d0a46cf","Type":"ContainerDied","Data":"22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123"} Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.276570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pvbq" 
event={"ID":"37ed8460-3a60-4ec0-b074-69244d0a46cf","Type":"ContainerStarted","Data":"bfed1abc20e4a3900fe3cf218d7787cc511569c90d1bc8df85591940fa740395"} Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.354635 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.359889 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-k2dhh" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.627403 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4bx4t"] Feb 18 05:48:34 crc kubenswrapper[4707]: E0218 05:48:34.627667 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb9f9f8-7162-479f-a789-dd3e61578ec4" containerName="collect-profiles" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.627684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb9f9f8-7162-479f-a789-dd3e61578ec4" containerName="collect-profiles" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.627779 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb9f9f8-7162-479f-a789-dd3e61578ec4" containerName="collect-profiles" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.628494 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.631421 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.647812 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bx4t"] Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.756182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnpgm\" (UniqueName: \"kubernetes.io/projected/c2a7cf05-4af5-406e-8395-75b3634484e9-kube-api-access-tnpgm\") pod \"redhat-marketplace-4bx4t\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.756259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-catalog-content\") pod \"redhat-marketplace-4bx4t\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.756558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-utilities\") pod \"redhat-marketplace-4bx4t\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.839669 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-f9dnq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" 
start-of-body= Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.839726 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f9dnq" podUID="055d0ae1-3458-4a39-85ae-6880ec2bae14" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.839683 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-f9dnq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.839855 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-f9dnq" podUID="055d0ae1-3458-4a39-85ae-6880ec2bae14" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.857904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-catalog-content\") pod \"redhat-marketplace-4bx4t\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.858041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-utilities\") pod \"redhat-marketplace-4bx4t\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.858091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-tnpgm\" (UniqueName: \"kubernetes.io/projected/c2a7cf05-4af5-406e-8395-75b3634484e9-kube-api-access-tnpgm\") pod \"redhat-marketplace-4bx4t\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.858606 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-catalog-content\") pod \"redhat-marketplace-4bx4t\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.858773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-utilities\") pod \"redhat-marketplace-4bx4t\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.864398 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.864431 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.866595 4707 patch_prober.go:28] interesting pod/console-f9d7485db-m2l7m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.866649 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-m2l7m" podUID="a8657192-49a2-4c45-bc94-bbc3e2e608af" containerName="console" probeResult="failure" output="Get 
\"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.892353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnpgm\" (UniqueName: \"kubernetes.io/projected/c2a7cf05-4af5-406e-8395-75b3634484e9-kube-api-access-tnpgm\") pod \"redhat-marketplace-4bx4t\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.930278 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-b87f8" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.936772 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:48:34 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Feb 18 05:48:34 crc kubenswrapper[4707]: [+]process-running ok Feb 18 05:48:34 crc kubenswrapper[4707]: healthz check failed Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.936874 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:48:34 crc kubenswrapper[4707]: I0218 05:48:34.944652 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.005357 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2c97d"] Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.006708 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.022328 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c97d"] Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.161940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-utilities\") pod \"redhat-marketplace-2c97d\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.162025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2hwn\" (UniqueName: \"kubernetes.io/projected/797df0f0-6b39-46a7-ad2b-fbf0276b311e-kube-api-access-l2hwn\") pod \"redhat-marketplace-2c97d\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.162088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-catalog-content\") pod \"redhat-marketplace-2c97d\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.222100 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bx4t"] Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.263946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-utilities\") pod \"redhat-marketplace-2c97d\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " 
pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.264020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2hwn\" (UniqueName: \"kubernetes.io/projected/797df0f0-6b39-46a7-ad2b-fbf0276b311e-kube-api-access-l2hwn\") pod \"redhat-marketplace-2c97d\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.264087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-catalog-content\") pod \"redhat-marketplace-2c97d\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.265190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-catalog-content\") pod \"redhat-marketplace-2c97d\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.266055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-utilities\") pod \"redhat-marketplace-2c97d\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.286638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2hwn\" (UniqueName: \"kubernetes.io/projected/797df0f0-6b39-46a7-ad2b-fbf0276b311e-kube-api-access-l2hwn\") pod \"redhat-marketplace-2c97d\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " 
pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: W0218 05:48:35.290356 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a7cf05_4af5_406e_8395_75b3634484e9.slice/crio-5b651ab6ad623b27a73275b2d42f410ae1ecc02c93d710a459511531f5cb922a WatchSource:0}: Error finding container 5b651ab6ad623b27a73275b2d42f410ae1ecc02c93d710a459511531f5cb922a: Status 404 returned error can't find the container with id 5b651ab6ad623b27a73275b2d42f410ae1ecc02c93d710a459511531f5cb922a Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.346686 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:48:35 crc kubenswrapper[4707]: E0218 05:48:35.369318 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 05:48:35 crc kubenswrapper[4707]: E0218 05:48:35.371206 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 05:48:35 crc kubenswrapper[4707]: E0218 05:48:35.375709 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 05:48:35 crc kubenswrapper[4707]: E0218 
05:48:35.375834 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" podUID="6bc0f806-810b-48df-8795-fb4962e906c1" containerName="kube-multus-additional-cni-plugins" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.615162 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.643437 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c97d"] Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.669987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.670709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.670770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.670822 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.672756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.677059 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.677580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.679090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.736865 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.744385 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.771689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kubelet-dir\") pod \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\" (UID: \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\") " Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.771755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kube-api-access\") pod \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\" (UID: \"6a9780f2-7a17-4f4c-9e5d-96c929cd7760\") " Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.772744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6a9780f2-7a17-4f4c-9e5d-96c929cd7760" (UID: "6a9780f2-7a17-4f4c-9e5d-96c929cd7760"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.775442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6a9780f2-7a17-4f4c-9e5d-96c929cd7760" (UID: "6a9780f2-7a17-4f4c-9e5d-96c929cd7760"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.807606 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k69vm"] Feb 18 05:48:35 crc kubenswrapper[4707]: E0218 05:48:35.807879 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9780f2-7a17-4f4c-9e5d-96c929cd7760" containerName="pruner" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.807895 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9780f2-7a17-4f4c-9e5d-96c929cd7760" containerName="pruner" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.808033 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9780f2-7a17-4f4c-9e5d-96c929cd7760" containerName="pruner" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.808753 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.810713 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.821284 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k69vm"] Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.874700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9llf\" (UniqueName: \"kubernetes.io/projected/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-kube-api-access-s9llf\") pod \"redhat-operators-k69vm\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.874751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-catalog-content\") pod \"redhat-operators-k69vm\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.874823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-utilities\") pod \"redhat-operators-k69vm\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.875159 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.875190 
4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a9780f2-7a17-4f4c-9e5d-96c929cd7760-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.937521 4707 patch_prober.go:28] interesting pod/router-default-5444994796-b87f8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 05:48:35 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Feb 18 05:48:35 crc kubenswrapper[4707]: [+]process-running ok Feb 18 05:48:35 crc kubenswrapper[4707]: healthz check failed Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.937586 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b87f8" podUID="e1a6c244-2da0-4dc4-8086-b3a725e8b24b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.970954 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.977969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9llf\" (UniqueName: \"kubernetes.io/projected/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-kube-api-access-s9llf\") pod \"redhat-operators-k69vm\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.978020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-catalog-content\") pod \"redhat-operators-k69vm\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.978705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-utilities\") pod \"redhat-operators-k69vm\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.980713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-utilities\") pod \"redhat-operators-k69vm\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:35 crc kubenswrapper[4707]: I0218 05:48:35.982204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-catalog-content\") pod \"redhat-operators-k69vm\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " 
pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.017451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9llf\" (UniqueName: \"kubernetes.io/projected/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-kube-api-access-s9llf\") pod \"redhat-operators-k69vm\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:36 crc kubenswrapper[4707]: W0218 05:48:36.175537 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-2c086f8363418b4eced6d3511d1b10c308f9bd468611b58dd6ca6dff9da368a7 WatchSource:0}: Error finding container 2c086f8363418b4eced6d3511d1b10c308f9bd468611b58dd6ca6dff9da368a7: Status 404 returned error can't find the container with id 2c086f8363418b4eced6d3511d1b10c308f9bd468611b58dd6ca6dff9da368a7 Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.214863 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vg87g"] Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.216280 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.228413 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vg87g"] Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.235722 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.287518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-catalog-content\") pod \"redhat-operators-vg87g\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.287899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-utilities\") pod \"redhat-operators-vg87g\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.288047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq8cj\" (UniqueName: \"kubernetes.io/projected/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-kube-api-access-sq8cj\") pod \"redhat-operators-vg87g\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.305939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2c086f8363418b4eced6d3511d1b10c308f9bd468611b58dd6ca6dff9da368a7"} Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.312940 4707 generic.go:334] "Generic (PLEG): container finished" podID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerID="4c440f7d5939af4f5e7e8cc827b3a267d526b17524eb90e0c9dbfc743b80dd9b" exitCode=0 Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.313355 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c97d" event={"ID":"797df0f0-6b39-46a7-ad2b-fbf0276b311e","Type":"ContainerDied","Data":"4c440f7d5939af4f5e7e8cc827b3a267d526b17524eb90e0c9dbfc743b80dd9b"}
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.313410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c97d" event={"ID":"797df0f0-6b39-46a7-ad2b-fbf0276b311e","Type":"ContainerStarted","Data":"f28a1442fd57b138a53293f3d9e12c091adb1790bf5cb0b0a96195ee23e4e093"}
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.316835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6a9780f2-7a17-4f4c-9e5d-96c929cd7760","Type":"ContainerDied","Data":"414ff6ddfeda2435634900387bdd4ed5c9d1e3142e375b2ed51604ee77b5793f"}
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.316881 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="414ff6ddfeda2435634900387bdd4ed5c9d1e3142e375b2ed51604ee77b5793f"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.316969 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.348700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"58211b89d69f76ce88054ae0c460f6edb685f1e536fa88434a312eb240ece466"}
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.354094 4707 generic.go:334] "Generic (PLEG): container finished" podID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerID="d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a" exitCode=0
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.354134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bx4t" event={"ID":"c2a7cf05-4af5-406e-8395-75b3634484e9","Type":"ContainerDied","Data":"d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a"}
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.354159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bx4t" event={"ID":"c2a7cf05-4af5-406e-8395-75b3634484e9","Type":"ContainerStarted","Data":"5b651ab6ad623b27a73275b2d42f410ae1ecc02c93d710a459511531f5cb922a"}
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.389255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq8cj\" (UniqueName: \"kubernetes.io/projected/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-kube-api-access-sq8cj\") pod \"redhat-operators-vg87g\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " pod="openshift-marketplace/redhat-operators-vg87g"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.389311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-catalog-content\") pod \"redhat-operators-vg87g\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " pod="openshift-marketplace/redhat-operators-vg87g"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.389333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-utilities\") pod \"redhat-operators-vg87g\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " pod="openshift-marketplace/redhat-operators-vg87g"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.389910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-utilities\") pod \"redhat-operators-vg87g\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " pod="openshift-marketplace/redhat-operators-vg87g"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.389952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-catalog-content\") pod \"redhat-operators-vg87g\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " pod="openshift-marketplace/redhat-operators-vg87g"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.415282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq8cj\" (UniqueName: \"kubernetes.io/projected/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-kube-api-access-sq8cj\") pod \"redhat-operators-vg87g\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " pod="openshift-marketplace/redhat-operators-vg87g"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.541705 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vg87g"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.801740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k69vm"]
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.957648 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-b87f8"
Feb 18 05:48:36 crc kubenswrapper[4707]: I0218 05:48:36.985752 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-b87f8"
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.068663 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vg87g"]
Feb 18 05:48:37 crc kubenswrapper[4707]: W0218 05:48:37.133163 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod830a9ed8_4e81_46f1_a66d_7ed41abdc1b0.slice/crio-a6008620ce0be507151310ad0220324374fb4f2dee0363ea0a391bf04cc907d7 WatchSource:0}: Error finding container a6008620ce0be507151310ad0220324374fb4f2dee0363ea0a391bf04cc907d7: Status 404 returned error can't find the container with id a6008620ce0be507151310ad0220324374fb4f2dee0363ea0a391bf04cc907d7
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.363128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7cc9aad39fd6f04f971e0c3437e55048704913ea00f03034012135eec3ff2227"}
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.363175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6ee383da6a6ba94b8363abaa53ed5e38db5d8413ff26c7508bb835a7d75339ed"}
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.364219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.378990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"868984fd15f46af15c48f3e86d8c3ced0282a246c5d23db2278b2d13e07863e9"}
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.410014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vg87g" event={"ID":"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0","Type":"ContainerStarted","Data":"a6008620ce0be507151310ad0220324374fb4f2dee0363ea0a391bf04cc907d7"}
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.439132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k69vm" event={"ID":"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a","Type":"ContainerDied","Data":"0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b"}
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.438708 4707 generic.go:334] "Generic (PLEG): container finished" podID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerID="0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b" exitCode=0
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.439846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k69vm" event={"ID":"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a","Type":"ContainerStarted","Data":"6f66e4905eccaf215893167932a35b55110b4b94ca455c5c7493005cef95c107"}
Feb 18 05:48:37 crc kubenswrapper[4707]: I0218 05:48:37.452175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"233bbdbea86e9230359f0f3b28d99066c527b398a128b5c56fc5cea84c352721"}
Feb 18 05:48:38 crc kubenswrapper[4707]: I0218 05:48:38.477578 4707 generic.go:334] "Generic (PLEG): container finished" podID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerID="c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97" exitCode=0
Feb 18 05:48:38 crc kubenswrapper[4707]: I0218 05:48:38.477647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vg87g" event={"ID":"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0","Type":"ContainerDied","Data":"c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97"}
Feb 18 05:48:38 crc kubenswrapper[4707]: I0218 05:48:38.860411 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 18 05:48:38 crc kubenswrapper[4707]: I0218 05:48:38.862044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:38 crc kubenswrapper[4707]: I0218 05:48:38.868776 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 18 05:48:38 crc kubenswrapper[4707]: I0218 05:48:38.868788 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 18 05:48:38 crc kubenswrapper[4707]: I0218 05:48:38.874306 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 18 05:48:38 crc kubenswrapper[4707]: I0218 05:48:38.984521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cab7e275-d387-4735-a395-003a0ffaa255-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cab7e275-d387-4735-a395-003a0ffaa255\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:38 crc kubenswrapper[4707]: I0218 05:48:38.984566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cab7e275-d387-4735-a395-003a0ffaa255-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cab7e275-d387-4735-a395-003a0ffaa255\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:39 crc kubenswrapper[4707]: I0218 05:48:39.085694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cab7e275-d387-4735-a395-003a0ffaa255-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cab7e275-d387-4735-a395-003a0ffaa255\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:39 crc kubenswrapper[4707]: I0218 05:48:39.085757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cab7e275-d387-4735-a395-003a0ffaa255-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cab7e275-d387-4735-a395-003a0ffaa255\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:39 crc kubenswrapper[4707]: I0218 05:48:39.086005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cab7e275-d387-4735-a395-003a0ffaa255-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cab7e275-d387-4735-a395-003a0ffaa255\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:39 crc kubenswrapper[4707]: I0218 05:48:39.110389 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cab7e275-d387-4735-a395-003a0ffaa255-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cab7e275-d387-4735-a395-003a0ffaa255\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:39 crc kubenswrapper[4707]: I0218 05:48:39.187613 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:39 crc kubenswrapper[4707]: I0218 05:48:39.553994 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 18 05:48:39 crc kubenswrapper[4707]: I0218 05:48:39.836329 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7f42d"
Feb 18 05:48:40 crc kubenswrapper[4707]: I0218 05:48:40.507729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cab7e275-d387-4735-a395-003a0ffaa255","Type":"ContainerStarted","Data":"9e374bc8054cf7c3942e58492e54cd7dbeeb6dcd56f3196744b33f5bf5ef5675"}
Feb 18 05:48:41 crc kubenswrapper[4707]: I0218 05:48:41.517171 4707 generic.go:334] "Generic (PLEG): container finished" podID="cab7e275-d387-4735-a395-003a0ffaa255" containerID="1a5e880ce4d7b02e7658365e2e57f6fae09adcc8e03ee8170db307e9d18b268d" exitCode=0
Feb 18 05:48:41 crc kubenswrapper[4707]: I0218 05:48:41.517724 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cab7e275-d387-4735-a395-003a0ffaa255","Type":"ContainerDied","Data":"1a5e880ce4d7b02e7658365e2e57f6fae09adcc8e03ee8170db307e9d18b268d"}
Feb 18 05:48:44 crc kubenswrapper[4707]: I0218 05:48:44.857113 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-f9dnq"
Feb 18 05:48:44 crc kubenswrapper[4707]: I0218 05:48:44.880202 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-m2l7m"
Feb 18 05:48:44 crc kubenswrapper[4707]: I0218 05:48:44.896993 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-m2l7m"
Feb 18 05:48:45 crc kubenswrapper[4707]: E0218 05:48:45.368525 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 05:48:45 crc kubenswrapper[4707]: E0218 05:48:45.370774 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 05:48:45 crc kubenswrapper[4707]: E0218 05:48:45.373452 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 05:48:45 crc kubenswrapper[4707]: E0218 05:48:45.373491 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" podUID="6bc0f806-810b-48df-8795-fb4962e906c1" containerName="kube-multus-additional-cni-plugins"
Feb 18 05:48:52 crc kubenswrapper[4707]: I0218 05:48:52.730693 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4"
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.071887 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.314216 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.348303 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.348284217 podStartE2EDuration="348.284217ms" podCreationTimestamp="2026-02-18 05:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:48:54.345400511 +0000 UTC m=+70.993359645" watchObservedRunningTime="2026-02-18 05:48:54.348284217 +0000 UTC m=+70.996243361"
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.429166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cab7e275-d387-4735-a395-003a0ffaa255-kubelet-dir\") pod \"cab7e275-d387-4735-a395-003a0ffaa255\" (UID: \"cab7e275-d387-4735-a395-003a0ffaa255\") "
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.429272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cab7e275-d387-4735-a395-003a0ffaa255-kube-api-access\") pod \"cab7e275-d387-4735-a395-003a0ffaa255\" (UID: \"cab7e275-d387-4735-a395-003a0ffaa255\") "
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.429298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cab7e275-d387-4735-a395-003a0ffaa255-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cab7e275-d387-4735-a395-003a0ffaa255" (UID: "cab7e275-d387-4735-a395-003a0ffaa255"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.430458 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cab7e275-d387-4735-a395-003a0ffaa255-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.440675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cab7e275-d387-4735-a395-003a0ffaa255-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cab7e275-d387-4735-a395-003a0ffaa255" (UID: "cab7e275-d387-4735-a395-003a0ffaa255"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.532233 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cab7e275-d387-4735-a395-003a0ffaa255-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.984262 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cab7e275-d387-4735-a395-003a0ffaa255","Type":"ContainerDied","Data":"9e374bc8054cf7c3942e58492e54cd7dbeeb6dcd56f3196744b33f5bf5ef5675"}
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.984408 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e374bc8054cf7c3942e58492e54cd7dbeeb6dcd56f3196744b33f5bf5ef5675"
Feb 18 05:48:54 crc kubenswrapper[4707]: I0218 05:48:54.984673 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 05:48:55 crc kubenswrapper[4707]: E0218 05:48:55.371687 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 05:48:55 crc kubenswrapper[4707]: E0218 05:48:55.373681 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 05:48:55 crc kubenswrapper[4707]: E0218 05:48:55.375716 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 05:48:55 crc kubenswrapper[4707]: E0218 05:48:55.375752 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" podUID="6bc0f806-810b-48df-8795-fb4962e906c1" containerName="kube-multus-additional-cni-plugins"
Feb 18 05:49:02 crc kubenswrapper[4707]: I0218 05:49:02.023414 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-879zs_6bc0f806-810b-48df-8795-fb4962e906c1/kube-multus-additional-cni-plugins/0.log"
Feb 18 05:49:02 crc kubenswrapper[4707]: I0218 05:49:02.024116 4707 generic.go:334] "Generic (PLEG): container finished" podID="6bc0f806-810b-48df-8795-fb4962e906c1" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" exitCode=137
Feb 18 05:49:02 crc kubenswrapper[4707]: I0218 05:49:02.024151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" event={"ID":"6bc0f806-810b-48df-8795-fb4962e906c1","Type":"ContainerDied","Data":"59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df"}
Feb 18 05:49:02 crc kubenswrapper[4707]: E0218 05:49:02.180045 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 18 05:49:02 crc kubenswrapper[4707]: E0218 05:49:02.180302 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xtqqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-prbjm_openshift-marketplace(065bc74d-6afe-4b4b-83a6-494643b467d7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 18 05:49:02 crc kubenswrapper[4707]: E0218 05:49:02.181556 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-prbjm" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7"
Feb 18 05:49:02 crc kubenswrapper[4707]: E0218 05:49:02.222186 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 18 05:49:02 crc kubenswrapper[4707]: E0218 05:49:02.222364 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l8jfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-m9tn5_openshift-marketplace(78c0afb9-5c9c-48b6-8d4e-458b99f37300): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 18 05:49:02 crc kubenswrapper[4707]: E0218 05:49:02.223524 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-m9tn5" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.653769 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-prbjm" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.654183 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-m9tn5" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300"
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.717416 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-879zs_6bc0f806-810b-48df-8795-fb4962e906c1/kube-multus-additional-cni-plugins/0.log"
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.717971 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.761433 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.761664 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2hwn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2c97d_openshift-marketplace(797df0f0-6b39-46a7-ad2b-fbf0276b311e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.762886 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2c97d" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.811885 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.812284 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnpgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4bx4t_openshift-marketplace(c2a7cf05-4af5-406e-8395-75b3634484e9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.813923 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4bx4t" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.816071 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.817745 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjknc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-pwkrp_openshift-marketplace(09b6ba7d-96fc-469f-8b02-f57b06081e67): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 18 05:49:03 crc kubenswrapper[4707]: E0218 05:49:03.819183 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-pwkrp" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67"
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.859689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6bc0f806-810b-48df-8795-fb4962e906c1-cni-sysctl-allowlist\") pod \"6bc0f806-810b-48df-8795-fb4962e906c1\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") "
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.859751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bc0f806-810b-48df-8795-fb4962e906c1-tuning-conf-dir\") pod \"6bc0f806-810b-48df-8795-fb4962e906c1\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") "
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.859857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6bc0f806-810b-48df-8795-fb4962e906c1-ready\") pod \"6bc0f806-810b-48df-8795-fb4962e906c1\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") "
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.859901 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mchtc\" (UniqueName: \"kubernetes.io/projected/6bc0f806-810b-48df-8795-fb4962e906c1-kube-api-access-mchtc\") pod \"6bc0f806-810b-48df-8795-fb4962e906c1\" (UID: \"6bc0f806-810b-48df-8795-fb4962e906c1\") "
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.860074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bc0f806-810b-48df-8795-fb4962e906c1-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "6bc0f806-810b-48df-8795-fb4962e906c1" (UID: "6bc0f806-810b-48df-8795-fb4962e906c1"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.860236 4707 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bc0f806-810b-48df-8795-fb4962e906c1-tuning-conf-dir\") on node \"crc\" DevicePath \"\""
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.860427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc0f806-810b-48df-8795-fb4962e906c1-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "6bc0f806-810b-48df-8795-fb4962e906c1" (UID: "6bc0f806-810b-48df-8795-fb4962e906c1"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.860602 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc0f806-810b-48df-8795-fb4962e906c1-ready" (OuterVolumeSpecName: "ready") pod "6bc0f806-810b-48df-8795-fb4962e906c1" (UID: "6bc0f806-810b-48df-8795-fb4962e906c1"). InnerVolumeSpecName "ready".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.867291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc0f806-810b-48df-8795-fb4962e906c1-kube-api-access-mchtc" (OuterVolumeSpecName: "kube-api-access-mchtc") pod "6bc0f806-810b-48df-8795-fb4962e906c1" (UID: "6bc0f806-810b-48df-8795-fb4962e906c1"). InnerVolumeSpecName "kube-api-access-mchtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.961965 4707 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6bc0f806-810b-48df-8795-fb4962e906c1-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.962022 4707 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6bc0f806-810b-48df-8795-fb4962e906c1-ready\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:03 crc kubenswrapper[4707]: I0218 05:49:03.962039 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mchtc\" (UniqueName: \"kubernetes.io/projected/6bc0f806-810b-48df-8795-fb4962e906c1-kube-api-access-mchtc\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:04 crc kubenswrapper[4707]: I0218 05:49:04.040808 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pvbq" event={"ID":"37ed8460-3a60-4ec0-b074-69244d0a46cf","Type":"ContainerStarted","Data":"c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded"} Feb 18 05:49:04 crc kubenswrapper[4707]: I0218 05:49:04.044542 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-879zs_6bc0f806-810b-48df-8795-fb4962e906c1/kube-multus-additional-cni-plugins/0.log" Feb 18 05:49:04 crc kubenswrapper[4707]: I0218 05:49:04.044628 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" event={"ID":"6bc0f806-810b-48df-8795-fb4962e906c1","Type":"ContainerDied","Data":"8b02539ea7924ded5ecbb0291626a60e8c3fa869bbd6442d7d0a1e1c8a32830f"} Feb 18 05:49:04 crc kubenswrapper[4707]: I0218 05:49:04.044678 4707 scope.go:117] "RemoveContainer" containerID="59ed9e2e01f5e78ce8cc9c64ad9565aa2c952bf781000552d0a73888886af8df" Feb 18 05:49:04 crc kubenswrapper[4707]: I0218 05:49:04.044727 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-879zs" Feb 18 05:49:04 crc kubenswrapper[4707]: E0218 05:49:04.077184 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2c97d" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" Feb 18 05:49:04 crc kubenswrapper[4707]: I0218 05:49:04.077943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vg87g" event={"ID":"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0","Type":"ContainerStarted","Data":"581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677"} Feb 18 05:49:04 crc kubenswrapper[4707]: I0218 05:49:04.077980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k69vm" event={"ID":"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a","Type":"ContainerStarted","Data":"5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d"} Feb 18 05:49:04 crc kubenswrapper[4707]: E0218 05:49:04.079977 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4bx4t" 
podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" Feb 18 05:49:04 crc kubenswrapper[4707]: E0218 05:49:04.080020 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-pwkrp" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" Feb 18 05:49:04 crc kubenswrapper[4707]: I0218 05:49:04.166737 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-879zs"] Feb 18 05:49:04 crc kubenswrapper[4707]: I0218 05:49:04.174347 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-879zs"] Feb 18 05:49:05 crc kubenswrapper[4707]: I0218 05:49:05.107883 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pc79b" Feb 18 05:49:05 crc kubenswrapper[4707]: I0218 05:49:05.208058 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pvbq" event={"ID":"37ed8460-3a60-4ec0-b074-69244d0a46cf","Type":"ContainerDied","Data":"c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded"} Feb 18 05:49:05 crc kubenswrapper[4707]: I0218 05:49:05.207987 4707 generic.go:334] "Generic (PLEG): container finished" podID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerID="c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded" exitCode=0 Feb 18 05:49:06 crc kubenswrapper[4707]: I0218 05:49:06.063233 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc0f806-810b-48df-8795-fb4962e906c1" path="/var/lib/kubelet/pods/6bc0f806-810b-48df-8795-fb4962e906c1/volumes" Feb 18 05:49:06 crc kubenswrapper[4707]: I0218 05:49:06.221735 4707 generic.go:334] "Generic (PLEG): container finished" podID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" 
containerID="581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677" exitCode=0 Feb 18 05:49:06 crc kubenswrapper[4707]: I0218 05:49:06.221850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vg87g" event={"ID":"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0","Type":"ContainerDied","Data":"581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677"} Feb 18 05:49:06 crc kubenswrapper[4707]: I0218 05:49:06.225873 4707 generic.go:334] "Generic (PLEG): container finished" podID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerID="5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d" exitCode=0 Feb 18 05:49:06 crc kubenswrapper[4707]: I0218 05:49:06.225929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k69vm" event={"ID":"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a","Type":"ContainerDied","Data":"5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d"} Feb 18 05:49:06 crc kubenswrapper[4707]: I0218 05:49:06.232202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pvbq" event={"ID":"37ed8460-3a60-4ec0-b074-69244d0a46cf","Type":"ContainerStarted","Data":"f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3"} Feb 18 05:49:06 crc kubenswrapper[4707]: I0218 05:49:06.292942 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5pvbq" podStartSLOduration=2.958269028 podStartE2EDuration="34.292914336s" podCreationTimestamp="2026-02-18 05:48:32 +0000 UTC" firstStartedPulling="2026-02-18 05:48:34.282593654 +0000 UTC m=+50.930552788" lastFinishedPulling="2026-02-18 05:49:05.617238962 +0000 UTC m=+82.265198096" observedRunningTime="2026-02-18 05:49:06.289437714 +0000 UTC m=+82.937396858" watchObservedRunningTime="2026-02-18 05:49:06.292914336 +0000 UTC m=+82.940873470" Feb 18 05:49:07 crc kubenswrapper[4707]: I0218 05:49:07.247214 
4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vg87g" event={"ID":"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0","Type":"ContainerStarted","Data":"7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886"} Feb 18 05:49:07 crc kubenswrapper[4707]: I0218 05:49:07.251574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k69vm" event={"ID":"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a","Type":"ContainerStarted","Data":"31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd"} Feb 18 05:49:07 crc kubenswrapper[4707]: I0218 05:49:07.277908 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vg87g" podStartSLOduration=3.110633944 podStartE2EDuration="31.277871355s" podCreationTimestamp="2026-02-18 05:48:36 +0000 UTC" firstStartedPulling="2026-02-18 05:48:38.482022039 +0000 UTC m=+55.129981173" lastFinishedPulling="2026-02-18 05:49:06.64925945 +0000 UTC m=+83.297218584" observedRunningTime="2026-02-18 05:49:07.276497959 +0000 UTC m=+83.924457083" watchObservedRunningTime="2026-02-18 05:49:07.277871355 +0000 UTC m=+83.925830489" Feb 18 05:49:07 crc kubenswrapper[4707]: I0218 05:49:07.312495 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k69vm" podStartSLOduration=2.996094882 podStartE2EDuration="32.312463447s" podCreationTimestamp="2026-02-18 05:48:35 +0000 UTC" firstStartedPulling="2026-02-18 05:48:37.442065991 +0000 UTC m=+54.090025115" lastFinishedPulling="2026-02-18 05:49:06.758434546 +0000 UTC m=+83.406393680" observedRunningTime="2026-02-18 05:49:07.309486298 +0000 UTC m=+83.957445432" watchObservedRunningTime="2026-02-18 05:49:07.312463447 +0000 UTC m=+83.960422581" Feb 18 05:49:12 crc kubenswrapper[4707]: I0218 05:49:12.996933 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-5pvbq" Feb 18 05:49:12 crc kubenswrapper[4707]: I0218 05:49:12.997524 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5pvbq" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.140124 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5pvbq" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.345855 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5pvbq" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.627139 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 05:49:13 crc kubenswrapper[4707]: E0218 05:49:13.627417 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cab7e275-d387-4735-a395-003a0ffaa255" containerName="pruner" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.627455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab7e275-d387-4735-a395-003a0ffaa255" containerName="pruner" Feb 18 05:49:13 crc kubenswrapper[4707]: E0218 05:49:13.627465 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc0f806-810b-48df-8795-fb4962e906c1" containerName="kube-multus-additional-cni-plugins" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.627472 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc0f806-810b-48df-8795-fb4962e906c1" containerName="kube-multus-additional-cni-plugins" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.627584 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc0f806-810b-48df-8795-fb4962e906c1" containerName="kube-multus-additional-cni-plugins" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.627600 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cab7e275-d387-4735-a395-003a0ffaa255" 
containerName="pruner" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.628079 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.631898 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.633729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.642727 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.711948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.712124 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.813709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 
05:49:13.813780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.813956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.855367 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:13 crc kubenswrapper[4707]: I0218 05:49:13.947183 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:14 crc kubenswrapper[4707]: I0218 05:49:14.457859 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 05:49:15 crc kubenswrapper[4707]: I0218 05:49:15.297142 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9tn5" event={"ID":"78c0afb9-5c9c-48b6-8d4e-458b99f37300","Type":"ContainerStarted","Data":"19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747"} Feb 18 05:49:15 crc kubenswrapper[4707]: I0218 05:49:15.299103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dfa8964f-4759-4c8d-83fb-c5d78c8201cb","Type":"ContainerStarted","Data":"bde9c7701c8c2deef88dc9f7125925345989601da2bff04111c718f0b2c1a57b"} Feb 18 05:49:15 crc kubenswrapper[4707]: I0218 05:49:15.299217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dfa8964f-4759-4c8d-83fb-c5d78c8201cb","Type":"ContainerStarted","Data":"6e8fac9e39c739464eece707800848b27117aec1c1f8351e432cdb1231e0bb81"} Feb 18 05:49:15 crc kubenswrapper[4707]: I0218 05:49:15.324022 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.323993875 podStartE2EDuration="2.323993875s" podCreationTimestamp="2026-02-18 05:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:15.319547158 +0000 UTC m=+91.967506312" watchObservedRunningTime="2026-02-18 05:49:15.323993875 +0000 UTC m=+91.971953009" Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.004196 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 
18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.237116 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.237520 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.287278 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.314122 4707 generic.go:334] "Generic (PLEG): container finished" podID="dfa8964f-4759-4c8d-83fb-c5d78c8201cb" containerID="bde9c7701c8c2deef88dc9f7125925345989601da2bff04111c718f0b2c1a57b" exitCode=0 Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.314271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dfa8964f-4759-4c8d-83fb-c5d78c8201cb","Type":"ContainerDied","Data":"bde9c7701c8c2deef88dc9f7125925345989601da2bff04111c718f0b2c1a57b"} Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.320636 4707 generic.go:334] "Generic (PLEG): container finished" podID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerID="19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747" exitCode=0 Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.320687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9tn5" event={"ID":"78c0afb9-5c9c-48b6-8d4e-458b99f37300","Type":"ContainerDied","Data":"19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747"} Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.327100 4707 generic.go:334] "Generic (PLEG): container finished" podID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerID="159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440" exitCode=0 Feb 18 05:49:16 crc 
kubenswrapper[4707]: I0218 05:49:16.327601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbjm" event={"ID":"065bc74d-6afe-4b4b-83a6-494643b467d7","Type":"ContainerDied","Data":"159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440"} Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.380761 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.542358 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.542410 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:49:16 crc kubenswrapper[4707]: I0218 05:49:16.584748 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:49:17 crc kubenswrapper[4707]: I0218 05:49:17.374469 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:49:18 crc kubenswrapper[4707]: I0218 05:49:18.376820 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vg87g"] Feb 18 05:49:18 crc kubenswrapper[4707]: I0218 05:49:18.435091 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:18 crc kubenswrapper[4707]: I0218 05:49:18.494016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kubelet-dir\") pod \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\" (UID: \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\") " Feb 18 05:49:18 crc kubenswrapper[4707]: I0218 05:49:18.494157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kube-api-access\") pod \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\" (UID: \"dfa8964f-4759-4c8d-83fb-c5d78c8201cb\") " Feb 18 05:49:18 crc kubenswrapper[4707]: I0218 05:49:18.500809 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dfa8964f-4759-4c8d-83fb-c5d78c8201cb" (UID: "dfa8964f-4759-4c8d-83fb-c5d78c8201cb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:49:18 crc kubenswrapper[4707]: I0218 05:49:18.500963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dfa8964f-4759-4c8d-83fb-c5d78c8201cb" (UID: "dfa8964f-4759-4c8d-83fb-c5d78c8201cb"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:49:18 crc kubenswrapper[4707]: I0218 05:49:18.595253 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:18 crc kubenswrapper[4707]: I0218 05:49:18.595284 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dfa8964f-4759-4c8d-83fb-c5d78c8201cb-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.357156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dfa8964f-4759-4c8d-83fb-c5d78c8201cb","Type":"ContainerDied","Data":"6e8fac9e39c739464eece707800848b27117aec1c1f8351e432cdb1231e0bb81"} Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.357777 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e8fac9e39c739464eece707800848b27117aec1c1f8351e432cdb1231e0bb81" Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.357861 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.363679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9tn5" event={"ID":"78c0afb9-5c9c-48b6-8d4e-458b99f37300","Type":"ContainerStarted","Data":"b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785"} Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.366587 4707 generic.go:334] "Generic (PLEG): container finished" podID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerID="2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b" exitCode=0 Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.366647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bx4t" event={"ID":"c2a7cf05-4af5-406e-8395-75b3634484e9","Type":"ContainerDied","Data":"2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b"} Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.373741 4707 generic.go:334] "Generic (PLEG): container finished" podID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerID="c8da2f46ce4256c8bdafe3d6c4c057f9704779b85d4ae241823d36716fe3e932" exitCode=0 Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.373818 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c97d" event={"ID":"797df0f0-6b39-46a7-ad2b-fbf0276b311e","Type":"ContainerDied","Data":"c8da2f46ce4256c8bdafe3d6c4c057f9704779b85d4ae241823d36716fe3e932"} Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.381733 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vg87g" podUID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerName="registry-server" containerID="cri-o://7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886" gracePeriod=2 Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.381852 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbjm" event={"ID":"065bc74d-6afe-4b4b-83a6-494643b467d7","Type":"ContainerStarted","Data":"1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7"} Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.406739 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m9tn5" podStartSLOduration=3.23241982 podStartE2EDuration="47.406718786s" podCreationTimestamp="2026-02-18 05:48:32 +0000 UTC" firstStartedPulling="2026-02-18 05:48:34.247891891 +0000 UTC m=+50.895851025" lastFinishedPulling="2026-02-18 05:49:18.422190857 +0000 UTC m=+95.070149991" observedRunningTime="2026-02-18 05:49:19.390749365 +0000 UTC m=+96.038708519" watchObservedRunningTime="2026-02-18 05:49:19.406718786 +0000 UTC m=+96.054677920" Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.457498 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-prbjm" podStartSLOduration=3.271914499 podStartE2EDuration="47.457482483s" podCreationTimestamp="2026-02-18 05:48:32 +0000 UTC" firstStartedPulling="2026-02-18 05:48:34.23570388 +0000 UTC m=+50.883663014" lastFinishedPulling="2026-02-18 05:49:18.421271864 +0000 UTC m=+95.069230998" observedRunningTime="2026-02-18 05:49:19.455754118 +0000 UTC m=+96.103713252" watchObservedRunningTime="2026-02-18 05:49:19.457482483 +0000 UTC m=+96.105441617" Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.784847 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.913929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-catalog-content\") pod \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.914356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-utilities\") pod \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.914430 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq8cj\" (UniqueName: \"kubernetes.io/projected/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-kube-api-access-sq8cj\") pod \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\" (UID: \"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0\") " Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.915137 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-utilities" (OuterVolumeSpecName: "utilities") pod "830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" (UID: "830a9ed8-4e81-46f1-a66d-7ed41abdc1b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:19 crc kubenswrapper[4707]: I0218 05:49:19.921255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-kube-api-access-sq8cj" (OuterVolumeSpecName: "kube-api-access-sq8cj") pod "830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" (UID: "830a9ed8-4e81-46f1-a66d-7ed41abdc1b0"). InnerVolumeSpecName "kube-api-access-sq8cj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.016052 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.016089 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq8cj\" (UniqueName: \"kubernetes.io/projected/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-kube-api-access-sq8cj\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.045437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" (UID: "830a9ed8-4e81-46f1-a66d-7ed41abdc1b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.117516 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.391348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c97d" event={"ID":"797df0f0-6b39-46a7-ad2b-fbf0276b311e","Type":"ContainerStarted","Data":"bfaa0cbe8f830e1aa9295e590b8497ea39a97445f06bf89ba2f43d1cda61f6eb"} Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.395230 4707 generic.go:334] "Generic (PLEG): container finished" podID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerID="7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886" exitCode=0 Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.395325 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-vg87g" event={"ID":"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0","Type":"ContainerDied","Data":"7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886"} Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.395403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vg87g" event={"ID":"830a9ed8-4e81-46f1-a66d-7ed41abdc1b0","Type":"ContainerDied","Data":"a6008620ce0be507151310ad0220324374fb4f2dee0363ea0a391bf04cc907d7"} Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.395426 4707 scope.go:117] "RemoveContainer" containerID="7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.395584 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vg87g" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.399863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bx4t" event={"ID":"c2a7cf05-4af5-406e-8395-75b3634484e9","Type":"ContainerStarted","Data":"40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5"} Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.403691 4707 generic.go:334] "Generic (PLEG): container finished" podID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerID="1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631" exitCode=0 Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.404063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwkrp" event={"ID":"09b6ba7d-96fc-469f-8b02-f57b06081e67","Type":"ContainerDied","Data":"1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631"} Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.415037 4707 scope.go:117] "RemoveContainer" containerID="581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677" Feb 18 05:49:20 
crc kubenswrapper[4707]: I0218 05:49:20.420429 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2c97d" podStartSLOduration=2.9963902449999997 podStartE2EDuration="46.420412463s" podCreationTimestamp="2026-02-18 05:48:34 +0000 UTC" firstStartedPulling="2026-02-18 05:48:36.381898341 +0000 UTC m=+53.029857475" lastFinishedPulling="2026-02-18 05:49:19.805920549 +0000 UTC m=+96.453879693" observedRunningTime="2026-02-18 05:49:20.417983088 +0000 UTC m=+97.065942242" watchObservedRunningTime="2026-02-18 05:49:20.420412463 +0000 UTC m=+97.068371597" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.437898 4707 scope.go:117] "RemoveContainer" containerID="c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.458157 4707 scope.go:117] "RemoveContainer" containerID="7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886" Feb 18 05:49:20 crc kubenswrapper[4707]: E0218 05:49:20.458685 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886\": container with ID starting with 7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886 not found: ID does not exist" containerID="7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.458734 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886"} err="failed to get container status \"7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886\": rpc error: code = NotFound desc = could not find container \"7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886\": container with ID starting with 
7e6353ce452d507a88685a0b54e78213e74c4686642c22415c09db4dcfcc6886 not found: ID does not exist" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.458809 4707 scope.go:117] "RemoveContainer" containerID="581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677" Feb 18 05:49:20 crc kubenswrapper[4707]: E0218 05:49:20.459598 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677\": container with ID starting with 581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677 not found: ID does not exist" containerID="581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.459626 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677"} err="failed to get container status \"581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677\": rpc error: code = NotFound desc = could not find container \"581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677\": container with ID starting with 581c79c12c25b5a88143c3b0eecaf8b72e81302772a630a6a23449df2d659677 not found: ID does not exist" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.459643 4707 scope.go:117] "RemoveContainer" containerID="c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97" Feb 18 05:49:20 crc kubenswrapper[4707]: E0218 05:49:20.460020 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97\": container with ID starting with c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97 not found: ID does not exist" containerID="c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97" Feb 18 05:49:20 crc 
kubenswrapper[4707]: I0218 05:49:20.460058 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97"} err="failed to get container status \"c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97\": rpc error: code = NotFound desc = could not find container \"c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97\": container with ID starting with c2871d49af817854de0a793a6f5dde77712122249547742097f8c13c07b2ff97 not found: ID does not exist" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.474090 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4bx4t" podStartSLOduration=3.090439771 podStartE2EDuration="46.474078936s" podCreationTimestamp="2026-02-18 05:48:34 +0000 UTC" firstStartedPulling="2026-02-18 05:48:36.381502931 +0000 UTC m=+53.029462065" lastFinishedPulling="2026-02-18 05:49:19.765142096 +0000 UTC m=+96.413101230" observedRunningTime="2026-02-18 05:49:20.470357057 +0000 UTC m=+97.118316191" watchObservedRunningTime="2026-02-18 05:49:20.474078936 +0000 UTC m=+97.122038070" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.486805 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vg87g"] Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.489395 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vg87g"] Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.624134 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 05:49:20 crc kubenswrapper[4707]: E0218 05:49:20.624399 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerName="extract-utilities" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.624413 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerName="extract-utilities" Feb 18 05:49:20 crc kubenswrapper[4707]: E0218 05:49:20.624422 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa8964f-4759-4c8d-83fb-c5d78c8201cb" containerName="pruner" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.624428 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa8964f-4759-4c8d-83fb-c5d78c8201cb" containerName="pruner" Feb 18 05:49:20 crc kubenswrapper[4707]: E0218 05:49:20.624439 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerName="registry-server" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.624446 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerName="registry-server" Feb 18 05:49:20 crc kubenswrapper[4707]: E0218 05:49:20.624458 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerName="extract-content" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.624464 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerName="extract-content" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.624596 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa8964f-4759-4c8d-83fb-c5d78c8201cb" containerName="pruner" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.624608 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" containerName="registry-server" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.625082 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.627296 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.630034 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.637602 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.727164 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-var-lock\") pod \"installer-9-crc\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.727248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kube-api-access\") pod \"installer-9-crc\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.727299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.828529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kube-api-access\") pod \"installer-9-crc\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.828623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.828685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-var-lock\") pod \"installer-9-crc\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.828767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-var-lock\") pod \"installer-9-crc\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.829167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.857021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kube-api-access\") pod \"installer-9-crc\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:20 crc kubenswrapper[4707]: I0218 05:49:20.944296 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 05:49:21 crc kubenswrapper[4707]: I0218 05:49:21.176845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 05:49:21 crc kubenswrapper[4707]: I0218 05:49:21.411300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9cf16072-3e25-4a57-b95a-86cb7a8681ac","Type":"ContainerStarted","Data":"91f9a1e1ae597b1b5ab9da75864b98b1dbd59a211bfd7348aa24816cd80c3985"} Feb 18 05:49:22 crc kubenswrapper[4707]: I0218 05:49:22.063169 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="830a9ed8-4e81-46f1-a66d-7ed41abdc1b0" path="/var/lib/kubelet/pods/830a9ed8-4e81-46f1-a66d-7ed41abdc1b0/volumes" Feb 18 05:49:22 crc kubenswrapper[4707]: I0218 05:49:22.424022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9cf16072-3e25-4a57-b95a-86cb7a8681ac","Type":"ContainerStarted","Data":"195136b7352dde7e4d4e2c868cb7d4b4fe92e6095e42b93d86c2c983aaa9ff12"} Feb 18 05:49:22 crc kubenswrapper[4707]: I0218 05:49:22.426228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwkrp" event={"ID":"09b6ba7d-96fc-469f-8b02-f57b06081e67","Type":"ContainerStarted","Data":"4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd"} Feb 18 05:49:22 crc kubenswrapper[4707]: I0218 05:49:22.455951 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.455918508 podStartE2EDuration="2.455918508s" podCreationTimestamp="2026-02-18 05:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:49:22.446253204 +0000 UTC m=+99.094212378" watchObservedRunningTime="2026-02-18 05:49:22.455918508 +0000 UTC m=+99.103877682" Feb 18 05:49:22 crc kubenswrapper[4707]: I0218 05:49:22.479276 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pwkrp" podStartSLOduration=2.00044668 podStartE2EDuration="49.479251543s" podCreationTimestamp="2026-02-18 05:48:33 +0000 UTC" firstStartedPulling="2026-02-18 05:48:34.263391669 +0000 UTC m=+50.911350803" lastFinishedPulling="2026-02-18 05:49:21.742196532 +0000 UTC m=+98.390155666" observedRunningTime="2026-02-18 05:49:22.4753402 +0000 UTC m=+99.123299364" watchObservedRunningTime="2026-02-18 05:49:22.479251543 +0000 UTC m=+99.127210677" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 05:49:23.167596 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 05:49:23.168158 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 05:49:23.218344 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 05:49:23.360085 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 05:49:23.360137 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 05:49:23.404228 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 
05:49:23.481606 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 05:49:23.484477 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 05:49:23.553038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:49:23 crc kubenswrapper[4707]: I0218 05:49:23.553114 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:49:24 crc kubenswrapper[4707]: I0218 05:49:24.596377 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pwkrp" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerName="registry-server" probeResult="failure" output=< Feb 18 05:49:24 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Feb 18 05:49:24 crc kubenswrapper[4707]: > Feb 18 05:49:24 crc kubenswrapper[4707]: I0218 05:49:24.945895 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:49:24 crc kubenswrapper[4707]: I0218 05:49:24.946520 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:49:25 crc kubenswrapper[4707]: I0218 05:49:25.009090 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:49:25 crc kubenswrapper[4707]: I0218 05:49:25.346906 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:49:25 crc kubenswrapper[4707]: I0218 05:49:25.346979 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:49:25 crc kubenswrapper[4707]: I0218 05:49:25.391045 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:49:25 crc kubenswrapper[4707]: I0218 05:49:25.483945 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:49:25 crc kubenswrapper[4707]: I0218 05:49:25.491528 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:49:26 crc kubenswrapper[4707]: I0218 05:49:26.777316 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9tn5"] Feb 18 05:49:26 crc kubenswrapper[4707]: I0218 05:49:26.778319 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m9tn5" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerName="registry-server" containerID="cri-o://b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785" gracePeriod=2 Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.174536 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.228486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8jfx\" (UniqueName: \"kubernetes.io/projected/78c0afb9-5c9c-48b6-8d4e-458b99f37300-kube-api-access-l8jfx\") pod \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.228608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-catalog-content\") pod \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.228646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-utilities\") pod \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\" (UID: \"78c0afb9-5c9c-48b6-8d4e-458b99f37300\") " Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.230207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-utilities" (OuterVolumeSpecName: "utilities") pod "78c0afb9-5c9c-48b6-8d4e-458b99f37300" (UID: "78c0afb9-5c9c-48b6-8d4e-458b99f37300"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.239755 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c0afb9-5c9c-48b6-8d4e-458b99f37300-kube-api-access-l8jfx" (OuterVolumeSpecName: "kube-api-access-l8jfx") pod "78c0afb9-5c9c-48b6-8d4e-458b99f37300" (UID: "78c0afb9-5c9c-48b6-8d4e-458b99f37300"). InnerVolumeSpecName "kube-api-access-l8jfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.294469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78c0afb9-5c9c-48b6-8d4e-458b99f37300" (UID: "78c0afb9-5c9c-48b6-8d4e-458b99f37300"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.330216 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.330426 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78c0afb9-5c9c-48b6-8d4e-458b99f37300-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.330500 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8jfx\" (UniqueName: \"kubernetes.io/projected/78c0afb9-5c9c-48b6-8d4e-458b99f37300-kube-api-access-l8jfx\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.461181 4707 generic.go:334] "Generic (PLEG): container finished" podID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerID="b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785" exitCode=0 Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.461261 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9tn5" event={"ID":"78c0afb9-5c9c-48b6-8d4e-458b99f37300","Type":"ContainerDied","Data":"b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785"} Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.461318 4707 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9tn5" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.461331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9tn5" event={"ID":"78c0afb9-5c9c-48b6-8d4e-458b99f37300","Type":"ContainerDied","Data":"295bd1cf47e3ae6877c578712053718bd25f13890a0056e8ae8ea414b440a674"} Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.461378 4707 scope.go:117] "RemoveContainer" containerID="b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.485146 4707 scope.go:117] "RemoveContainer" containerID="19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.499804 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9tn5"] Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.504279 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m9tn5"] Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.517735 4707 scope.go:117] "RemoveContainer" containerID="087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.548098 4707 scope.go:117] "RemoveContainer" containerID="b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785" Feb 18 05:49:27 crc kubenswrapper[4707]: E0218 05:49:27.548701 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785\": container with ID starting with b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785 not found: ID does not exist" containerID="b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.548734 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785"} err="failed to get container status \"b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785\": rpc error: code = NotFound desc = could not find container \"b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785\": container with ID starting with b513b04a0448ac97650faa255f8365dbc46eba63b81629528efeb5fdd0fac785 not found: ID does not exist" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.548757 4707 scope.go:117] "RemoveContainer" containerID="19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747" Feb 18 05:49:27 crc kubenswrapper[4707]: E0218 05:49:27.549111 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747\": container with ID starting with 19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747 not found: ID does not exist" containerID="19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.549131 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747"} err="failed to get container status \"19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747\": rpc error: code = NotFound desc = could not find container \"19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747\": container with ID starting with 19f3656092950b0b7d47045c5e4781e9159d3b0b6f9570eaef1e518bc3d88747 not found: ID does not exist" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.549183 4707 scope.go:117] "RemoveContainer" containerID="087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1" Feb 18 05:49:27 crc kubenswrapper[4707]: E0218 
05:49:27.549726 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1\": container with ID starting with 087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1 not found: ID does not exist" containerID="087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1" Feb 18 05:49:27 crc kubenswrapper[4707]: I0218 05:49:27.549819 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1"} err="failed to get container status \"087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1\": rpc error: code = NotFound desc = could not find container \"087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1\": container with ID starting with 087bbf5215e6361ef08162b420bec4242b285f98083726aedc12786e300873b1 not found: ID does not exist" Feb 18 05:49:28 crc kubenswrapper[4707]: I0218 05:49:28.064236 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" path="/var/lib/kubelet/pods/78c0afb9-5c9c-48b6-8d4e-458b99f37300/volumes" Feb 18 05:49:29 crc kubenswrapper[4707]: I0218 05:49:29.180261 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c97d"] Feb 18 05:49:29 crc kubenswrapper[4707]: I0218 05:49:29.180574 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2c97d" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerName="registry-server" containerID="cri-o://bfaa0cbe8f830e1aa9295e590b8497ea39a97445f06bf89ba2f43d1cda61f6eb" gracePeriod=2 Feb 18 05:49:29 crc kubenswrapper[4707]: I0218 05:49:29.478110 4707 generic.go:334] "Generic (PLEG): container finished" podID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" 
containerID="bfaa0cbe8f830e1aa9295e590b8497ea39a97445f06bf89ba2f43d1cda61f6eb" exitCode=0 Feb 18 05:49:29 crc kubenswrapper[4707]: I0218 05:49:29.478153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c97d" event={"ID":"797df0f0-6b39-46a7-ad2b-fbf0276b311e","Type":"ContainerDied","Data":"bfaa0cbe8f830e1aa9295e590b8497ea39a97445f06bf89ba2f43d1cda61f6eb"} Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.341147 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.475721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-utilities\") pod \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.475846 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2hwn\" (UniqueName: \"kubernetes.io/projected/797df0f0-6b39-46a7-ad2b-fbf0276b311e-kube-api-access-l2hwn\") pod \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.475938 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-catalog-content\") pod \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\" (UID: \"797df0f0-6b39-46a7-ad2b-fbf0276b311e\") " Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.476406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-utilities" (OuterVolumeSpecName: "utilities") pod "797df0f0-6b39-46a7-ad2b-fbf0276b311e" (UID: 
"797df0f0-6b39-46a7-ad2b-fbf0276b311e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.477176 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.480541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797df0f0-6b39-46a7-ad2b-fbf0276b311e-kube-api-access-l2hwn" (OuterVolumeSpecName: "kube-api-access-l2hwn") pod "797df0f0-6b39-46a7-ad2b-fbf0276b311e" (UID: "797df0f0-6b39-46a7-ad2b-fbf0276b311e"). InnerVolumeSpecName "kube-api-access-l2hwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.486267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c97d" event={"ID":"797df0f0-6b39-46a7-ad2b-fbf0276b311e","Type":"ContainerDied","Data":"f28a1442fd57b138a53293f3d9e12c091adb1790bf5cb0b0a96195ee23e4e093"} Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.486318 4707 scope.go:117] "RemoveContainer" containerID="bfaa0cbe8f830e1aa9295e590b8497ea39a97445f06bf89ba2f43d1cda61f6eb" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.486435 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2c97d" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.498886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "797df0f0-6b39-46a7-ad2b-fbf0276b311e" (UID: "797df0f0-6b39-46a7-ad2b-fbf0276b311e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.506719 4707 scope.go:117] "RemoveContainer" containerID="c8da2f46ce4256c8bdafe3d6c4c057f9704779b85d4ae241823d36716fe3e932" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.523274 4707 scope.go:117] "RemoveContainer" containerID="4c440f7d5939af4f5e7e8cc827b3a267d526b17524eb90e0c9dbfc743b80dd9b" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.578756 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2hwn\" (UniqueName: \"kubernetes.io/projected/797df0f0-6b39-46a7-ad2b-fbf0276b311e-kube-api-access-l2hwn\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.578815 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797df0f0-6b39-46a7-ad2b-fbf0276b311e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.822841 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c97d"] Feb 18 05:49:30 crc kubenswrapper[4707]: I0218 05:49:30.826931 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c97d"] Feb 18 05:49:32 crc kubenswrapper[4707]: I0218 05:49:32.062653 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" path="/var/lib/kubelet/pods/797df0f0-6b39-46a7-ad2b-fbf0276b311e/volumes" Feb 18 05:49:32 crc kubenswrapper[4707]: I0218 05:49:32.239727 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cbn6t"] Feb 18 05:49:33 crc kubenswrapper[4707]: I0218 05:49:33.590510 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:49:33 crc kubenswrapper[4707]: I0218 05:49:33.642525 
4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:49:34 crc kubenswrapper[4707]: I0218 05:49:34.980248 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pwkrp"] Feb 18 05:49:35 crc kubenswrapper[4707]: I0218 05:49:35.517747 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pwkrp" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerName="registry-server" containerID="cri-o://4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd" gracePeriod=2 Feb 18 05:49:35 crc kubenswrapper[4707]: I0218 05:49:35.882835 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:49:35 crc kubenswrapper[4707]: I0218 05:49:35.952019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-utilities\") pod \"09b6ba7d-96fc-469f-8b02-f57b06081e67\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " Feb 18 05:49:35 crc kubenswrapper[4707]: I0218 05:49:35.952104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjknc\" (UniqueName: \"kubernetes.io/projected/09b6ba7d-96fc-469f-8b02-f57b06081e67-kube-api-access-mjknc\") pod \"09b6ba7d-96fc-469f-8b02-f57b06081e67\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " Feb 18 05:49:35 crc kubenswrapper[4707]: I0218 05:49:35.952197 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-catalog-content\") pod \"09b6ba7d-96fc-469f-8b02-f57b06081e67\" (UID: \"09b6ba7d-96fc-469f-8b02-f57b06081e67\") " Feb 18 05:49:35 crc kubenswrapper[4707]: I0218 
05:49:35.952948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-utilities" (OuterVolumeSpecName: "utilities") pod "09b6ba7d-96fc-469f-8b02-f57b06081e67" (UID: "09b6ba7d-96fc-469f-8b02-f57b06081e67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:35 crc kubenswrapper[4707]: I0218 05:49:35.959011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b6ba7d-96fc-469f-8b02-f57b06081e67-kube-api-access-mjknc" (OuterVolumeSpecName: "kube-api-access-mjknc") pod "09b6ba7d-96fc-469f-8b02-f57b06081e67" (UID: "09b6ba7d-96fc-469f-8b02-f57b06081e67"). InnerVolumeSpecName "kube-api-access-mjknc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.010677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09b6ba7d-96fc-469f-8b02-f57b06081e67" (UID: "09b6ba7d-96fc-469f-8b02-f57b06081e67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.053166 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.053201 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjknc\" (UniqueName: \"kubernetes.io/projected/09b6ba7d-96fc-469f-8b02-f57b06081e67-kube-api-access-mjknc\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.053213 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b6ba7d-96fc-469f-8b02-f57b06081e67-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.528348 4707 generic.go:334] "Generic (PLEG): container finished" podID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerID="4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd" exitCode=0 Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.528485 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pwkrp" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.528432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwkrp" event={"ID":"09b6ba7d-96fc-469f-8b02-f57b06081e67","Type":"ContainerDied","Data":"4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd"} Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.528874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwkrp" event={"ID":"09b6ba7d-96fc-469f-8b02-f57b06081e67","Type":"ContainerDied","Data":"85bca17c3bd545a41382a6df72700cee7647e8cb7c34efc1621d9689e192f015"} Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.529317 4707 scope.go:117] "RemoveContainer" containerID="4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.550307 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pwkrp"] Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.550891 4707 scope.go:117] "RemoveContainer" containerID="1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.555548 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pwkrp"] Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.577327 4707 scope.go:117] "RemoveContainer" containerID="a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.591258 4707 scope.go:117] "RemoveContainer" containerID="4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd" Feb 18 05:49:36 crc kubenswrapper[4707]: E0218 05:49:36.591713 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd\": container with ID starting with 4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd not found: ID does not exist" containerID="4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.591761 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd"} err="failed to get container status \"4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd\": rpc error: code = NotFound desc = could not find container \"4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd\": container with ID starting with 4313d762a396f31f9ff8ec6de379102d50268553b5f2a95a117b5e5f086bb2fd not found: ID does not exist" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.591807 4707 scope.go:117] "RemoveContainer" containerID="1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631" Feb 18 05:49:36 crc kubenswrapper[4707]: E0218 05:49:36.592101 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631\": container with ID starting with 1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631 not found: ID does not exist" containerID="1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.592136 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631"} err="failed to get container status \"1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631\": rpc error: code = NotFound desc = could not find container \"1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631\": container with ID 
starting with 1300164d57a10efb71fe8e9c0fb2c23b4127637e330a8f8b193a5b405a1d4631 not found: ID does not exist" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.592155 4707 scope.go:117] "RemoveContainer" containerID="a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d" Feb 18 05:49:36 crc kubenswrapper[4707]: E0218 05:49:36.592372 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d\": container with ID starting with a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d not found: ID does not exist" containerID="a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d" Feb 18 05:49:36 crc kubenswrapper[4707]: I0218 05:49:36.592402 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d"} err="failed to get container status \"a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d\": rpc error: code = NotFound desc = could not find container \"a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d\": container with ID starting with a53ae95ee8b1292302b63fa446c24a86a210143f66756ada9d4e0bca28a4a91d not found: ID does not exist" Feb 18 05:49:38 crc kubenswrapper[4707]: I0218 05:49:38.062371 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" path="/var/lib/kubelet/pods/09b6ba7d-96fc-469f-8b02-f57b06081e67/volumes" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.271629 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" podUID="ec70ffb1-f091-47ed-b947-6af13fd6d34f" containerName="oauth-openshift" containerID="cri-o://ee367f57fe225be18c2d2923000ffd6d9b1f3315dc09041d9e525b02ec090edf" gracePeriod=15 Feb 18 05:49:57 crc 
kubenswrapper[4707]: I0218 05:49:57.656995 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec70ffb1-f091-47ed-b947-6af13fd6d34f" containerID="ee367f57fe225be18c2d2923000ffd6d9b1f3315dc09041d9e525b02ec090edf" exitCode=0 Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.657132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" event={"ID":"ec70ffb1-f091-47ed-b947-6af13fd6d34f","Type":"ContainerDied","Data":"ee367f57fe225be18c2d2923000ffd6d9b1f3315dc09041d9e525b02ec090edf"} Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.744043 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.791359 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6f67d677dd-mntdx"] Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792265 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerName="registry-server" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792285 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerName="registry-server" Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792311 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerName="registry-server" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792319 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerName="registry-server" Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792345 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerName="extract-content" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 
05:49:57.792356 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerName="extract-content" Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792371 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec70ffb1-f091-47ed-b947-6af13fd6d34f" containerName="oauth-openshift" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792379 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec70ffb1-f091-47ed-b947-6af13fd6d34f" containerName="oauth-openshift" Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792405 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerName="registry-server" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792413 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerName="registry-server" Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792431 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerName="extract-utilities" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792442 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerName="extract-utilities" Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792459 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerName="extract-utilities" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792466 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerName="extract-utilities" Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792481 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerName="extract-content" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 
05:49:57.792489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerName="extract-content" Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792504 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerName="extract-utilities" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792511 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerName="extract-utilities" Feb 18 05:49:57 crc kubenswrapper[4707]: E0218 05:49:57.792528 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerName="extract-content" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792538 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerName="extract-content" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792845 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c0afb9-5c9c-48b6-8d4e-458b99f37300" containerName="registry-server" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792870 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec70ffb1-f091-47ed-b947-6af13fd6d34f" containerName="oauth-openshift" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792889 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="797df0f0-6b39-46a7-ad2b-fbf0276b311e" containerName="registry-server" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.792897 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b6ba7d-96fc-469f-8b02-f57b06081e67" containerName="registry-server" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.793830 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.813774 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6f67d677dd-mntdx"] Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852177 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-router-certs\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bcjs\" (UniqueName: \"kubernetes.io/projected/ec70ffb1-f091-47ed-b947-6af13fd6d34f-kube-api-access-7bcjs\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-cliconfig\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-provider-selection\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-login\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-error\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-policies\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-ocp-branding-template\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-session\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-service-ca\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-dir\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-serving-cert\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-idp-0-file-data\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-trusted-ca-bundle\") pod \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\" (UID: \"ec70ffb1-f091-47ed-b947-6af13fd6d34f\") " Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-template-error\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87z8k\" (UniqueName: \"kubernetes.io/projected/c5124644-4434-468a-be7c-d91ae9458450-kube-api-access-87z8k\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.852985 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.853010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f67d677dd-mntdx\" 
(UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.853408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-session\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.853445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c5124644-4434-468a-be7c-d91ae9458450-audit-dir\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.853470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.853493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.853528 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-audit-policies\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.853516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.853661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.853706 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.854398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.854439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-template-login\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.854430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.854448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.854513 4707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.854862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.855295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.860298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec70ffb1-f091-47ed-b947-6af13fd6d34f-kube-api-access-7bcjs" (OuterVolumeSpecName: "kube-api-access-7bcjs") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "kube-api-access-7bcjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.860709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.862483 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.863050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.863879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.864348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.864625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.868183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.868326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ec70ffb1-f091-47ed-b947-6af13fd6d34f" (UID: "ec70ffb1-f091-47ed-b947-6af13fd6d34f"). 
InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.956547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-template-error\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.956649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87z8k\" (UniqueName: \"kubernetes.io/projected/c5124644-4434-468a-be7c-d91ae9458450-kube-api-access-87z8k\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.956693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.956746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.956789 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.956857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-session\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.956901 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c5124644-4434-468a-be7c-d91ae9458450-audit-dir\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.956950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.956987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-audit-policies\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " 
pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-template-login\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957383 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957412 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957434 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957456 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957478 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957501 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957523 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957544 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957564 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957585 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957604 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bcjs\" (UniqueName: \"kubernetes.io/projected/ec70ffb1-f091-47ed-b947-6af13fd6d34f-kube-api-access-7bcjs\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957625 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957645 4707 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec70ffb1-f091-47ed-b947-6af13fd6d34f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.957623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c5124644-4434-468a-be7c-d91ae9458450-audit-dir\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.959819 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.960507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.961535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 
05:49:57.962019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.962071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-session\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.962319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.962597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-template-error\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.963163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c5124644-4434-468a-be7c-d91ae9458450-audit-policies\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: 
\"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.964465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.965474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.968222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-user-template-login\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.969737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c5124644-4434-468a-be7c-d91ae9458450-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:57 crc kubenswrapper[4707]: I0218 05:49:57.977426 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-87z8k\" (UniqueName: \"kubernetes.io/projected/c5124644-4434-468a-be7c-d91ae9458450-kube-api-access-87z8k\") pod \"oauth-openshift-6f67d677dd-mntdx\" (UID: \"c5124644-4434-468a-be7c-d91ae9458450\") " pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:58 crc kubenswrapper[4707]: I0218 05:49:58.121922 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" Feb 18 05:49:58 crc kubenswrapper[4707]: I0218 05:49:58.599669 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6f67d677dd-mntdx"] Feb 18 05:49:58 crc kubenswrapper[4707]: I0218 05:49:58.669546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" event={"ID":"ec70ffb1-f091-47ed-b947-6af13fd6d34f","Type":"ContainerDied","Data":"1d5ac31ec1eb769cca3f84edeef693be6d718b79935d903df69a6bfda6e0c3ff"} Feb 18 05:49:58 crc kubenswrapper[4707]: I0218 05:49:58.669619 4707 scope.go:117] "RemoveContainer" containerID="ee367f57fe225be18c2d2923000ffd6d9b1f3315dc09041d9e525b02ec090edf" Feb 18 05:49:58 crc kubenswrapper[4707]: I0218 05:49:58.669767 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cbn6t" Feb 18 05:49:58 crc kubenswrapper[4707]: I0218 05:49:58.674358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" event={"ID":"c5124644-4434-468a-be7c-d91ae9458450","Type":"ContainerStarted","Data":"a9991b1536899a243eb8a69ebe191fbebbada44b47342301e9505eafe5cef24a"} Feb 18 05:49:58 crc kubenswrapper[4707]: I0218 05:49:58.734055 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cbn6t"] Feb 18 05:49:58 crc kubenswrapper[4707]: I0218 05:49:58.736853 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cbn6t"] Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.443562 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.444834 4707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.445051 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.445349 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e" gracePeriod=15 Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.445417 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61" gracePeriod=15 Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.445536 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e" gracePeriod=15 Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.445565 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7" gracePeriod=15 Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.445616 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d" gracePeriod=15 Feb 18 05:49:59 crc 
kubenswrapper[4707]: I0218 05:49:59.447474 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 18 05:49:59 crc kubenswrapper[4707]: E0218 05:49:59.447857 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.447895 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 18 05:49:59 crc kubenswrapper[4707]: E0218 05:49:59.447921 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.447960 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 18 05:49:59 crc kubenswrapper[4707]: E0218 05:49:59.447985 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.448000 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 18 05:49:59 crc kubenswrapper[4707]: E0218 05:49:59.448031 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.448048 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 18 05:49:59 crc kubenswrapper[4707]: E0218 05:49:59.448076 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.448092 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 18 05:49:59 crc kubenswrapper[4707]: E0218 05:49:59.448122 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.448138 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.448353 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.448383 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.448405 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.448429 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.448451 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.586658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.587264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.587306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.587330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.587360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.587378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.587399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.587419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.685465 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.686834 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7" exitCode=0
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.686880 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61" exitCode=0
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.686893 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e" exitCode=0
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.686904 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d" exitCode=2
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.688693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.688785 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.688833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.688866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.688913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.688965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689269 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" event={"ID":"c5124644-4434-468a-be7c-d91ae9458450","Type":"ContainerStarted","Data":"e557323e7edd18de1cc6ec53fab260035a2ff5c5c94bca1073416b740392f84b"}
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.689842 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx"
Feb 18 05:49:59 crc kubenswrapper[4707]: I0218 05:49:59.700280 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx"
Feb 18 05:50:00 crc kubenswrapper[4707]: I0218 05:50:00.068194 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec70ffb1-f091-47ed-b947-6af13fd6d34f" path="/var/lib/kubelet/pods/ec70ffb1-f091-47ed-b947-6af13fd6d34f/volumes"
Feb 18 05:50:04 crc kubenswrapper[4707]: E0218 05:50:04.489746 4707 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.17:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.490677 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:50:04 crc kubenswrapper[4707]: E0218 05:50:04.518991 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18954147aba33dc0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 05:50:04.518243776 +0000 UTC m=+141.166202910,LastTimestamp:2026-02-18 05:50:04.518243776 +0000 UTC m=+141.166202910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.697762 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.698032 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.698262 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.700689 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.728245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"801a641d0108c9c395dbe910e60be721e61b2c012feca30b4dc0939921916641"}
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.729382 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" containerID="195136b7352dde7e4d4e2c868cb7d4b4fe92e6095e42b93d86c2c983aaa9ff12" exitCode=0
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.729462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9cf16072-3e25-4a57-b95a-86cb7a8681ac","Type":"ContainerDied","Data":"195136b7352dde7e4d4e2c868cb7d4b4fe92e6095e42b93d86c2c983aaa9ff12"}
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.730094 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:04 crc kubenswrapper[4707]: I0218 05:50:04.730309 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:05 crc kubenswrapper[4707]: I0218 05:50:05.736602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c"}
Feb 18 05:50:05 crc kubenswrapper[4707]: E0218 05:50:05.737241 4707 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.17:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:50:05 crc kubenswrapper[4707]: I0218 05:50:05.737255 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:05 crc kubenswrapper[4707]: I0218 05:50:05.737568 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:05 crc kubenswrapper[4707]: I0218 05:50:05.973285 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:50:05 crc kubenswrapper[4707]: I0218 05:50:05.973963 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:05 crc kubenswrapper[4707]: I0218 05:50:05.974228 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:06 crc kubenswrapper[4707]: I0218 05:50:06.744552 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 05:50:06 crc kubenswrapper[4707]: I0218 05:50:06.744729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9cf16072-3e25-4a57-b95a-86cb7a8681ac","Type":"ContainerDied","Data":"91f9a1e1ae597b1b5ab9da75864b98b1dbd59a211bfd7348aa24816cd80c3985"}
Feb 18 05:50:06 crc kubenswrapper[4707]: I0218 05:50:06.745015 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91f9a1e1ae597b1b5ab9da75864b98b1dbd59a211bfd7348aa24816cd80c3985"
Feb 18 05:50:06 crc kubenswrapper[4707]: E0218 05:50:06.744866 4707 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.17:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 05:50:06 crc kubenswrapper[4707]: I0218 05:50:06.831412 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 18 05:50:06 crc kubenswrapper[4707]: I0218 05:50:06.832058 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:50:06 crc kubenswrapper[4707]: I0218 05:50:06.832553 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:06 crc kubenswrapper[4707]: I0218 05:50:06.832722 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:06 crc kubenswrapper[4707]: I0218 05:50:06.832945 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.751006 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.752598 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e" exitCode=0
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.752649 4707 scope.go:117] "RemoveContainer" containerID="4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.752774 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.766675 4707 scope.go:117] "RemoveContainer" containerID="cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.778622 4707 scope.go:117] "RemoveContainer" containerID="7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.793935 4707 scope.go:117] "RemoveContainer" containerID="d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.808025 4707 scope.go:117] "RemoveContainer" containerID="de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.821808 4707 scope.go:117] "RemoveContainer" containerID="75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.839014 4707 scope.go:117] "RemoveContainer" containerID="4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7"
Feb 18 05:50:07 crc kubenswrapper[4707]: E0218 05:50:07.839717 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\": container with ID starting with 4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7 not found: ID does not exist" containerID="4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.839842 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7"} err="failed to get container status \"4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\": rpc error: code = NotFound desc = could not find container \"4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7\": container with ID starting with 4eddb45d7d736a8398d95f271db1de2c82ca4c5c08eacfdbcbed6f20951186d7 not found: ID does not exist"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.840258 4707 scope.go:117] "RemoveContainer" containerID="cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61"
Feb 18 05:50:07 crc kubenswrapper[4707]: E0218 05:50:07.841325 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61\": container with ID starting with cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61 not found: ID does not exist" containerID="cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.841369 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61"} err="failed to get container status \"cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61\": rpc error: code = NotFound desc = could not find container \"cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61\": container with ID starting with cf3adbbb671874de9106ed3fc3db2f2889d0dbbff6c71046d68736dde5877e61 not found: ID does not exist"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.841398 4707 scope.go:117] "RemoveContainer" containerID="7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e"
Feb 18 05:50:07 crc kubenswrapper[4707]: E0218 05:50:07.841704 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\": container with ID starting with 7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e not found: ID does not exist" containerID="7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.841728 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e"} err="failed to get container status \"7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\": rpc error: code = NotFound desc = could not find container \"7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e\": container with ID starting with 7ad2bc8dedc33e5cfc26e1e95b6af56fd39e52247cfa65b49165ec8bd353bd0e not found: ID does not exist"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.841744 4707 scope.go:117] "RemoveContainer" containerID="d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d"
Feb 18 05:50:07 crc kubenswrapper[4707]: E0218 05:50:07.842152 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\": container with ID starting with d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d not found: ID does not exist" containerID="d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.842194 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d"} err="failed to get container status \"d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\": rpc error: code = NotFound desc = could not find container \"d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d\": container with ID starting with d39f506dd0067a774ceca9a5a88c0c00a2a99756ac0d00fc8aa44f1c60d0c85d not found: ID does not exist"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.842232 4707 scope.go:117] "RemoveContainer" containerID="de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e"
Feb 18 05:50:07 crc kubenswrapper[4707]: E0218 05:50:07.842806 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\": container with ID starting with de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e not found: ID does not exist" containerID="de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.842842 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e"} err="failed to get container status \"de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\": rpc error: code = NotFound desc = could not find container \"de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e\": container with ID starting with de4025a7c2a7497954789e9a276b6cdefa02f3b4c023e025aac93711a3911f8e not found: ID does not exist"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.842892 4707 scope.go:117] "RemoveContainer" containerID="75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae"
Feb 18 05:50:07 crc kubenswrapper[4707]: E0218 05:50:07.843340 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\": container with ID starting with 75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae not found: ID does not exist" containerID="75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae"
Feb 18 05:50:07 crc kubenswrapper[4707]: I0218 05:50:07.843372 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae"} err="failed to get container status \"75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\": rpc error: code = NotFound desc = could not find container \"75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae\": container with ID starting with 75c541d094cc2c1e05ae11f72db30d544891787c97f66d37bc9d714dd26381ae not found: ID does not exist"
Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.105935 4707 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.17:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" volumeName="registry-storage"
Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.124635 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.125866 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.126453 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.126719 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.127015 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused"
Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.127040 4707 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.127205 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="200ms"
Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.200914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-var-lock\") pod \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") "
Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201115 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-var-lock" (OuterVolumeSpecName: "var-lock") pod "9cf16072-3e25-4a57-b95a-86cb7a8681ac" (UID: "9cf16072-3e25-4a57-b95a-86cb7a8681ac"). InnerVolumeSpecName "var-lock".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kubelet-dir\") pod \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201143 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201161 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kube-api-access\") pod \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\" (UID: \"9cf16072-3e25-4a57-b95a-86cb7a8681ac\") " Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9cf16072-3e25-4a57-b95a-86cb7a8681ac" (UID: "9cf16072-3e25-4a57-b95a-86cb7a8681ac"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201605 4707 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201622 4707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-var-lock\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201633 4707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201644 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.201655 4707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.209343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9cf16072-3e25-4a57-b95a-86cb7a8681ac" (UID: "9cf16072-3e25-4a57-b95a-86cb7a8681ac"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.268038 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.269028 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.304776 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cf16072-3e25-4a57-b95a-86cb7a8681ac-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.328697 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="400ms" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.368553 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.369167 4707 status_manager.go:851] "Failed to get status for pod" 
podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:08 crc kubenswrapper[4707]: I0218 05:50:08.369630 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.502374 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18954147aba33dc0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 05:50:04.518243776 +0000 UTC m=+141.166202910,LastTimestamp:2026-02-18 05:50:04.518243776 +0000 UTC m=+141.166202910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 05:50:08 crc kubenswrapper[4707]: E0218 05:50:08.730376 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="800ms" Feb 18 05:50:09 crc kubenswrapper[4707]: E0218 05:50:09.531674 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="1.6s" Feb 18 05:50:10 crc kubenswrapper[4707]: I0218 05:50:10.065259 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 18 05:50:11 crc kubenswrapper[4707]: E0218 05:50:11.133173 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="3.2s" Feb 18 05:50:11 crc kubenswrapper[4707]: I0218 05:50:11.782906 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 05:50:11 crc kubenswrapper[4707]: I0218 05:50:11.782957 4707 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="41f45798868cbd0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4" exitCode=1 Feb 18 05:50:11 crc kubenswrapper[4707]: I0218 05:50:11.782993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"41f45798868cbd0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4"} Feb 18 05:50:11 crc kubenswrapper[4707]: I0218 05:50:11.783428 4707 
scope.go:117] "RemoveContainer" containerID="41f45798868cbd0c6f60e972eecd6073edbeb5d95d492192a49ec1e9022701d4" Feb 18 05:50:11 crc kubenswrapper[4707]: I0218 05:50:11.784420 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:11 crc kubenswrapper[4707]: I0218 05:50:11.784831 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:11 crc kubenswrapper[4707]: I0218 05:50:11.785353 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:12 crc kubenswrapper[4707]: I0218 05:50:12.790582 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 05:50:12 crc kubenswrapper[4707]: I0218 05:50:12.790936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2b7375fce59783c2a1580db0c7998138cfac70c51dc92748d876c7649003d93b"} Feb 18 05:50:12 crc kubenswrapper[4707]: I0218 05:50:12.791989 4707 
status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:12 crc kubenswrapper[4707]: I0218 05:50:12.792629 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:12 crc kubenswrapper[4707]: I0218 05:50:12.793067 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.052748 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.053436 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.053953 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.054389 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.070224 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.070249 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:13 crc kubenswrapper[4707]: E0218 05:50:13.070581 4707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.071194 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:13 crc kubenswrapper[4707]: W0218 05:50:13.099179 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-130a484ed4b4b6bdcbfe046f3b1b858a5bb6ecf57f209e21c75e8d8bbc41d420 WatchSource:0}: Error finding container 130a484ed4b4b6bdcbfe046f3b1b858a5bb6ecf57f209e21c75e8d8bbc41d420: Status 404 returned error can't find the container with id 130a484ed4b4b6bdcbfe046f3b1b858a5bb6ecf57f209e21c75e8d8bbc41d420 Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.614505 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.798915 4707 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="5ea4dee2461ad63f6f1f82ad6346a74a21bcc939e7158406a42103063ae312b0" exitCode=0 Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.799009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"5ea4dee2461ad63f6f1f82ad6346a74a21bcc939e7158406a42103063ae312b0"} Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.799091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"130a484ed4b4b6bdcbfe046f3b1b858a5bb6ecf57f209e21c75e8d8bbc41d420"} Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.799685 4707 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.799711 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.800068 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:13 crc kubenswrapper[4707]: E0218 05:50:13.800556 4707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.800776 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:13 crc kubenswrapper[4707]: I0218 05:50:13.801073 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:14 crc kubenswrapper[4707]: I0218 05:50:14.060022 4707 status_manager.go:851] "Failed to get status for pod" 
podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:14 crc kubenswrapper[4707]: I0218 05:50:14.060768 4707 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:14 crc kubenswrapper[4707]: I0218 05:50:14.061492 4707 status_manager.go:851] "Failed to get status for pod" podUID="c5124644-4434-468a-be7c-d91ae9458450" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6f67d677dd-mntdx\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:14 crc kubenswrapper[4707]: I0218 05:50:14.061872 4707 status_manager.go:851] "Failed to get status for pod" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Feb 18 05:50:14 crc kubenswrapper[4707]: I0218 05:50:14.806884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6d60a9c1d54738b37a0ec19cbed480f77a126fac1801d274891b3ed75e21b182"} Feb 18 05:50:14 crc kubenswrapper[4707]: I0218 05:50:14.806927 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"61c463b7e87b75ba754338f4320b41819c83400130cc1fbab75c8c4e7502c40a"} Feb 18 05:50:14 crc kubenswrapper[4707]: I0218 05:50:14.806939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"be3c90099acb5b0a23e89b5255c83ed7d9d5fa6ee488b2f0c4d47fb8e21b4b9c"} Feb 18 05:50:14 crc kubenswrapper[4707]: I0218 05:50:14.806951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"31b5205ab73b6b69e56db613a7b09acbc4779c71b405eb2fda1b555aa03b2fb8"} Feb 18 05:50:15 crc kubenswrapper[4707]: I0218 05:50:15.815039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3ddcfe061185c932b631be11596119f7c8151705dd92fb60aa119583d7d1a7f1"} Feb 18 05:50:15 crc kubenswrapper[4707]: I0218 05:50:15.815362 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:15 crc kubenswrapper[4707]: I0218 05:50:15.815279 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:15 crc kubenswrapper[4707]: I0218 05:50:15.815381 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:16 crc kubenswrapper[4707]: I0218 05:50:16.159004 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:50:16 crc kubenswrapper[4707]: I0218 05:50:16.164214 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:50:18 crc kubenswrapper[4707]: I0218 05:50:18.072096 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:18 crc kubenswrapper[4707]: I0218 05:50:18.072133 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:18 crc kubenswrapper[4707]: I0218 05:50:18.078310 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:21 crc kubenswrapper[4707]: I0218 05:50:21.035283 4707 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:21 crc kubenswrapper[4707]: I0218 05:50:21.382670 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:50:21 crc kubenswrapper[4707]: I0218 05:50:21.382743 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:50:21 crc kubenswrapper[4707]: I0218 05:50:21.854986 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log" Feb 18 05:50:21 crc kubenswrapper[4707]: I0218 05:50:21.857929 4707 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" 
containerID="3ddcfe061185c932b631be11596119f7c8151705dd92fb60aa119583d7d1a7f1" exitCode=255 Feb 18 05:50:21 crc kubenswrapper[4707]: I0218 05:50:21.857979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3ddcfe061185c932b631be11596119f7c8151705dd92fb60aa119583d7d1a7f1"} Feb 18 05:50:21 crc kubenswrapper[4707]: I0218 05:50:21.858462 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:21 crc kubenswrapper[4707]: I0218 05:50:21.858506 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:21 crc kubenswrapper[4707]: I0218 05:50:21.865589 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:22 crc kubenswrapper[4707]: I0218 05:50:22.094167 4707 scope.go:117] "RemoveContainer" containerID="3ddcfe061185c932b631be11596119f7c8151705dd92fb60aa119583d7d1a7f1" Feb 18 05:50:22 crc kubenswrapper[4707]: I0218 05:50:22.114091 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e45f533e-98c8-45bc-9499-40a05a59044c" Feb 18 05:50:22 crc kubenswrapper[4707]: I0218 05:50:22.866020 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log" Feb 18 05:50:22 crc kubenswrapper[4707]: I0218 05:50:22.867959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4b0ff2b69be12e6da096a4bd49b12bedaf058555abf47a3873051532fec8f1c4"} Feb 18 05:50:22 crc kubenswrapper[4707]: I0218 05:50:22.868219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:22 crc kubenswrapper[4707]: I0218 05:50:22.868342 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:22 crc kubenswrapper[4707]: I0218 05:50:22.868379 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:22 crc kubenswrapper[4707]: I0218 05:50:22.871175 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e45f533e-98c8-45bc-9499-40a05a59044c" Feb 18 05:50:23 crc kubenswrapper[4707]: I0218 05:50:23.627496 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 05:50:23 crc kubenswrapper[4707]: I0218 05:50:23.875035 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:23 crc kubenswrapper[4707]: I0218 05:50:23.875077 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="470ed6e7-c010-42bb-8889-a7a460f92c43" Feb 18 05:50:23 crc kubenswrapper[4707]: I0218 05:50:23.878027 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e45f533e-98c8-45bc-9499-40a05a59044c" Feb 18 05:50:24 crc kubenswrapper[4707]: I0218 
05:50:24.031463 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 18 05:50:24 crc kubenswrapper[4707]: I0218 05:50:24.924415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 18 05:50:25 crc kubenswrapper[4707]: I0218 05:50:25.531408 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 18 05:50:25 crc kubenswrapper[4707]: I0218 05:50:25.576659 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 18 05:50:26 crc kubenswrapper[4707]: I0218 05:50:26.059488 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 05:50:26 crc kubenswrapper[4707]: I0218 05:50:26.198137 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 18 05:50:26 crc kubenswrapper[4707]: I0218 05:50:26.282217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 18 05:50:26 crc kubenswrapper[4707]: I0218 05:50:26.291289 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 18 05:50:26 crc kubenswrapper[4707]: I0218 05:50:26.554596 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 18 05:50:27 crc kubenswrapper[4707]: I0218 05:50:27.057501 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 18 05:50:27 crc kubenswrapper[4707]: I0218 05:50:27.172704 4707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 18 05:50:27 crc kubenswrapper[4707]: I0218 
05:50:27.394535 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 18 05:50:27 crc kubenswrapper[4707]: I0218 05:50:27.448676 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 18 05:50:27 crc kubenswrapper[4707]: I0218 05:50:27.554988 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 18 05:50:27 crc kubenswrapper[4707]: I0218 05:50:27.570064 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 18 05:50:27 crc kubenswrapper[4707]: I0218 05:50:27.706452 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 18 05:50:27 crc kubenswrapper[4707]: I0218 05:50:27.732908 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 05:50:27 crc kubenswrapper[4707]: I0218 05:50:27.757318 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.012407 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.043041 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.048492 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.137927 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.157887 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.249514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.256417 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.333708 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.334029 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.410539 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.411297 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.422968 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.423707 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.433835 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.453119 4707 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.462138 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.531448 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.571472 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.579939 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.607814 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.620264 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.623004 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.633040 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.653429 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.712211 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 
05:50:28.761589 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.780337 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.795317 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.854946 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 18 05:50:28 crc kubenswrapper[4707]: I0218 05:50:28.996319 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.025848 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.057549 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.089870 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.111915 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.119492 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.169149 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.176510 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.217316 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.303241 4707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.363548 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.377619 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.476508 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.545726 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.548791 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.566102 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.591771 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 
05:50:29.621614 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.701206 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.714372 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.742122 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.744967 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.745416 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.748342 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.756201 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.762953 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.796295 4707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.815016 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 18 
05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.825368 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.874984 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 18 05:50:29 crc kubenswrapper[4707]: I0218 05:50:29.981106 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.021143 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.043156 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.053358 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.058229 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.107048 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.159761 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.242832 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.248154 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.329092 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.353729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.367131 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.432281 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.460093 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.479856 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.485243 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.510604 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.521563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.561437 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 18 05:50:30 crc 
kubenswrapper[4707]: I0218 05:50:30.577430 4707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.577830 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6f67d677dd-mntdx" podStartSLOduration=58.577778859 podStartE2EDuration="58.577778859s" podCreationTimestamp="2026-02-18 05:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:50:22.167778036 +0000 UTC m=+158.815737180" watchObservedRunningTime="2026-02-18 05:50:30.577778859 +0000 UTC m=+167.225738003" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.582920 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.582979 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.600971 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=9.600951729 podStartE2EDuration="9.600951729s" podCreationTimestamp="2026-02-18 05:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:50:30.600316952 +0000 UTC m=+167.248276106" watchObservedRunningTime="2026-02-18 05:50:30.600951729 +0000 UTC m=+167.248910873" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.717719 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.730456 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.779550 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.824225 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.830194 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.871225 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.899787 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.913402 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.935854 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.957338 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.969273 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.978279 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 18 05:50:30 crc kubenswrapper[4707]: I0218 05:50:30.981178 
4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.006182 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.015907 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.026016 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.080112 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.228323 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.239583 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.270255 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.358364 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.362515 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.380130 4707 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.393602 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.413447 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.419114 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.420724 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.440541 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.452581 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.476351 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.479371 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.487443 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.494768 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.619681 
4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.658916 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.715556 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.796276 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.866133 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.913148 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.936825 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 18 05:50:31 crc kubenswrapper[4707]: I0218 05:50:31.992698 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.055732 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.099376 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.107050 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 05:50:32 crc 
kubenswrapper[4707]: I0218 05:50:32.197928 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.219491 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.221586 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.232033 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.250855 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.298525 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.337725 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.415301 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.419676 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.420582 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.441622 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.481659 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.509037 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.518258 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.573274 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.610609 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.650475 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.676695 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.705117 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.713090 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.748304 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 18 05:50:32 crc 
kubenswrapper[4707]: I0218 05:50:32.777886 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.849236 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.861847 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.881137 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.882663 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.952936 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 18 05:50:32 crc kubenswrapper[4707]: I0218 05:50:32.997744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.041953 4707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.070643 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.072887 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.077026 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.167114 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.205416 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.209108 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.214476 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.215352 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.295636 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.365985 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.401084 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.501062 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.588354 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 18 05:50:33 crc kubenswrapper[4707]: I0218 05:50:33.997130 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 05:50:34 crc kubenswrapper[4707]: I0218 05:50:34.001629 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 18 05:50:34 crc kubenswrapper[4707]: I0218 05:50:34.052382 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 18 05:50:34 crc kubenswrapper[4707]: I0218 05:50:34.175044 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 18 05:50:34 crc kubenswrapper[4707]: I0218 05:50:34.188753 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 18 05:50:34 crc kubenswrapper[4707]: I0218 05:50:34.400484 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 18 05:50:35 crc kubenswrapper[4707]: I0218 05:50:35.019112 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 18 05:50:35 crc kubenswrapper[4707]: I0218 05:50:35.273934 4707 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 18 05:50:35 crc kubenswrapper[4707]: I0218 05:50:35.879366 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 18 05:50:35 crc kubenswrapper[4707]: I0218 05:50:35.894509 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 18 05:50:36 crc kubenswrapper[4707]: I0218 05:50:36.300734 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 18 05:50:36 crc kubenswrapper[4707]: I0218 
05:50:36.556434 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 18 05:50:37 crc kubenswrapper[4707]: I0218 05:50:37.335294 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 18 05:50:37 crc kubenswrapper[4707]: I0218 05:50:37.345014 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 18 05:50:37 crc kubenswrapper[4707]: I0218 05:50:37.614500 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 18 05:50:37 crc kubenswrapper[4707]: I0218 05:50:37.615990 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 18 05:50:37 crc kubenswrapper[4707]: I0218 05:50:37.947097 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 18 05:50:38 crc kubenswrapper[4707]: I0218 05:50:38.036852 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 18 05:50:38 crc kubenswrapper[4707]: I0218 05:50:38.048635 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 18 05:50:38 crc kubenswrapper[4707]: I0218 05:50:38.174330 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 18 05:50:38 crc kubenswrapper[4707]: I0218 05:50:38.341419 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 18 05:50:38 crc kubenswrapper[4707]: I0218 05:50:38.562888 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 18 05:50:38 crc kubenswrapper[4707]: I0218 05:50:38.841139 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 18 05:50:39 crc kubenswrapper[4707]: I0218 05:50:39.003493 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 18 05:50:39 crc kubenswrapper[4707]: I0218 05:50:39.055425 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 18 05:50:39 crc kubenswrapper[4707]: I0218 05:50:39.461909 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 18 05:50:39 crc kubenswrapper[4707]: I0218 05:50:39.519835 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 18 05:50:39 crc kubenswrapper[4707]: I0218 05:50:39.910305 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 18 05:50:39 crc kubenswrapper[4707]: I0218 05:50:39.976114 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 05:50:40 crc kubenswrapper[4707]: I0218 05:50:40.305384 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 18 05:50:40 crc kubenswrapper[4707]: I0218 05:50:40.312364 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 18 05:50:40 crc kubenswrapper[4707]: I0218 05:50:40.505384 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 05:50:40 crc 
kubenswrapper[4707]: I0218 05:50:40.612859 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 18 05:50:40 crc kubenswrapper[4707]: I0218 05:50:40.716499 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 18 05:50:40 crc kubenswrapper[4707]: I0218 05:50:40.935437 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 18 05:50:41 crc kubenswrapper[4707]: I0218 05:50:41.002496 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 05:50:41 crc kubenswrapper[4707]: I0218 05:50:41.173646 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 18 05:50:41 crc kubenswrapper[4707]: I0218 05:50:41.489028 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 18 05:50:41 crc kubenswrapper[4707]: I0218 05:50:41.525561 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 18 05:50:42 crc kubenswrapper[4707]: I0218 05:50:42.355495 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 18 05:50:42 crc kubenswrapper[4707]: I0218 05:50:42.784137 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 18 05:50:42 crc kubenswrapper[4707]: I0218 05:50:42.845153 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 18 05:50:42 crc kubenswrapper[4707]: I0218 05:50:42.847310 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" 
Feb 18 05:50:43 crc kubenswrapper[4707]: I0218 05:50:43.274009 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 18 05:50:43 crc kubenswrapper[4707]: I0218 05:50:43.298314 4707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 05:50:43 crc kubenswrapper[4707]: I0218 05:50:43.298596 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c" gracePeriod=5 Feb 18 05:50:43 crc kubenswrapper[4707]: I0218 05:50:43.543858 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 18 05:50:43 crc kubenswrapper[4707]: I0218 05:50:43.647271 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 18 05:50:43 crc kubenswrapper[4707]: I0218 05:50:43.688386 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 18 05:50:43 crc kubenswrapper[4707]: I0218 05:50:43.709400 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 18 05:50:43 crc kubenswrapper[4707]: I0218 05:50:43.731268 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 18 05:50:43 crc kubenswrapper[4707]: I0218 05:50:43.770589 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 18 05:50:44 crc kubenswrapper[4707]: I0218 05:50:44.351663 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 05:50:44 crc kubenswrapper[4707]: I0218 05:50:44.557804 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 18 05:50:44 crc kubenswrapper[4707]: I0218 05:50:44.957303 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 18 05:50:45 crc kubenswrapper[4707]: I0218 05:50:45.021531 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 18 05:50:45 crc kubenswrapper[4707]: I0218 05:50:45.311763 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 18 05:50:45 crc kubenswrapper[4707]: I0218 05:50:45.382619 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 18 05:50:45 crc kubenswrapper[4707]: I0218 05:50:45.428915 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 18 05:50:45 crc kubenswrapper[4707]: I0218 05:50:45.536062 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 18 05:50:45 crc kubenswrapper[4707]: I0218 05:50:45.686313 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 18 05:50:46 crc kubenswrapper[4707]: I0218 05:50:46.202806 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 18 05:50:46 crc kubenswrapper[4707]: I0218 05:50:46.263634 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 18 05:50:46 crc kubenswrapper[4707]: I0218 05:50:46.474067 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 18 05:50:46 crc kubenswrapper[4707]: I0218 05:50:46.693902 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 18 05:50:46 crc kubenswrapper[4707]: I0218 05:50:46.980927 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 18 05:50:47 crc kubenswrapper[4707]: I0218 05:50:47.576353 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 18 05:50:48 crc kubenswrapper[4707]: I0218 05:50:48.392288 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 18 05:50:48 crc kubenswrapper[4707]: I0218 05:50:48.430854 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 18 05:50:48 crc kubenswrapper[4707]: I0218 05:50:48.826433 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 18 05:50:48 crc kubenswrapper[4707]: I0218 05:50:48.903948 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 18 05:50:48 crc kubenswrapper[4707]: I0218 05:50:48.904045 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.029716 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.034303 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.034385 4707 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c" exitCode=137 Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.034453 4707 scope.go:117] "RemoveContainer" containerID="6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.034495 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.051989 4707 scope.go:117] "RemoveContainer" containerID="6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c" Feb 18 05:50:49 crc kubenswrapper[4707]: E0218 05:50:49.052625 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c\": container with ID starting with 6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c not found: ID does not exist" containerID="6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.052685 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c"} err="failed to get container status \"6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c\": rpc error: code = NotFound desc = could not find container \"6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c\": container with ID starting with 6e437a18449630966ae88f9032ba83ff948789d9419b9e14654c09bb4b30c40c not found: ID does not exist" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.056072 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.056163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.056189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.056293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.056694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.056736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.056853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.056906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.057029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.057394 4707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.057430 4707 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.057451 4707 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.057469 4707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.066174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.158291 4707 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 18 05:50:49 crc kubenswrapper[4707]: I0218 05:50:49.274982 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 18 05:50:50 crc kubenswrapper[4707]: I0218 05:50:50.060393 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 18 05:50:51 crc kubenswrapper[4707]: I0218 05:50:51.382677 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:50:51 crc kubenswrapper[4707]: I0218 05:50:51.382752 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.543888 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-74wxc"] Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.544711 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" podUID="24beed91-e86e-4dae-a372-ea06be0cefb9" containerName="controller-manager" 
containerID="cri-o://eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395" gracePeriod=30 Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.645957 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn"] Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.646235 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" podUID="fd8702f2-1cdf-48fb-ad08-e6f533cc8404" containerName="route-controller-manager" containerID="cri-o://a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228" gracePeriod=30 Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.862929 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.877075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-proxy-ca-bundles\") pod \"24beed91-e86e-4dae-a372-ea06be0cefb9\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.877139 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-client-ca\") pod \"24beed91-e86e-4dae-a372-ea06be0cefb9\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.877193 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-config\") pod \"24beed91-e86e-4dae-a372-ea06be0cefb9\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " Feb 18 05:51:14 crc 
kubenswrapper[4707]: I0218 05:51:14.877216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6z8b\" (UniqueName: \"kubernetes.io/projected/24beed91-e86e-4dae-a372-ea06be0cefb9-kube-api-access-v6z8b\") pod \"24beed91-e86e-4dae-a372-ea06be0cefb9\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.877241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24beed91-e86e-4dae-a372-ea06be0cefb9-serving-cert\") pod \"24beed91-e86e-4dae-a372-ea06be0cefb9\" (UID: \"24beed91-e86e-4dae-a372-ea06be0cefb9\") " Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.878015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "24beed91-e86e-4dae-a372-ea06be0cefb9" (UID: "24beed91-e86e-4dae-a372-ea06be0cefb9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.878174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-config" (OuterVolumeSpecName: "config") pod "24beed91-e86e-4dae-a372-ea06be0cefb9" (UID: "24beed91-e86e-4dae-a372-ea06be0cefb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.878438 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-client-ca" (OuterVolumeSpecName: "client-ca") pod "24beed91-e86e-4dae-a372-ea06be0cefb9" (UID: "24beed91-e86e-4dae-a372-ea06be0cefb9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.889443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24beed91-e86e-4dae-a372-ea06be0cefb9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "24beed91-e86e-4dae-a372-ea06be0cefb9" (UID: "24beed91-e86e-4dae-a372-ea06be0cefb9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.890258 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24beed91-e86e-4dae-a372-ea06be0cefb9-kube-api-access-v6z8b" (OuterVolumeSpecName: "kube-api-access-v6z8b") pod "24beed91-e86e-4dae-a372-ea06be0cefb9" (UID: "24beed91-e86e-4dae-a372-ea06be0cefb9"). InnerVolumeSpecName "kube-api-access-v6z8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.959025 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.977763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-client-ca\") pod \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.977915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-serving-cert\") pod \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.977951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7qps\" (UniqueName: \"kubernetes.io/projected/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-kube-api-access-p7qps\") pod \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.977976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-config\") pod \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\" (UID: \"fd8702f2-1cdf-48fb-ad08-e6f533cc8404\") " Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.978237 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.978269 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-config\") on node \"crc\" 
DevicePath \"\"" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.978286 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6z8b\" (UniqueName: \"kubernetes.io/projected/24beed91-e86e-4dae-a372-ea06be0cefb9-kube-api-access-v6z8b\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.978305 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24beed91-e86e-4dae-a372-ea06be0cefb9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.978320 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24beed91-e86e-4dae-a372-ea06be0cefb9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.978683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-client-ca" (OuterVolumeSpecName: "client-ca") pod "fd8702f2-1cdf-48fb-ad08-e6f533cc8404" (UID: "fd8702f2-1cdf-48fb-ad08-e6f533cc8404"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.979140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-config" (OuterVolumeSpecName: "config") pod "fd8702f2-1cdf-48fb-ad08-e6f533cc8404" (UID: "fd8702f2-1cdf-48fb-ad08-e6f533cc8404"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.982183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fd8702f2-1cdf-48fb-ad08-e6f533cc8404" (UID: "fd8702f2-1cdf-48fb-ad08-e6f533cc8404"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:51:14 crc kubenswrapper[4707]: I0218 05:51:14.991329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-kube-api-access-p7qps" (OuterVolumeSpecName: "kube-api-access-p7qps") pod "fd8702f2-1cdf-48fb-ad08-e6f533cc8404" (UID: "fd8702f2-1cdf-48fb-ad08-e6f533cc8404"). InnerVolumeSpecName "kube-api-access-p7qps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.079550 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.079586 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.079596 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7qps\" (UniqueName: \"kubernetes.io/projected/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-kube-api-access-p7qps\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.079606 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd8702f2-1cdf-48fb-ad08-e6f533cc8404-config\") on node \"crc\" DevicePath 
\"\"" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.172205 4707 generic.go:334] "Generic (PLEG): container finished" podID="24beed91-e86e-4dae-a372-ea06be0cefb9" containerID="eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395" exitCode=0 Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.172259 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.172287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" event={"ID":"24beed91-e86e-4dae-a372-ea06be0cefb9","Type":"ContainerDied","Data":"eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395"} Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.172319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-74wxc" event={"ID":"24beed91-e86e-4dae-a372-ea06be0cefb9","Type":"ContainerDied","Data":"2ba40bd5c2d03883067411fa0a2a7b1eec4171f5681b028346eb84f13d223352"} Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.172339 4707 scope.go:117] "RemoveContainer" containerID="eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.174296 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd8702f2-1cdf-48fb-ad08-e6f533cc8404" containerID="a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228" exitCode=0 Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.174337 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.174352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" event={"ID":"fd8702f2-1cdf-48fb-ad08-e6f533cc8404","Type":"ContainerDied","Data":"a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228"} Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.174386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn" event={"ID":"fd8702f2-1cdf-48fb-ad08-e6f533cc8404","Type":"ContainerDied","Data":"f3f9073a48a59b19eacc1d029dda73dd1156128aeb7cff719ca2918f76b60375"} Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.200469 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-74wxc"] Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.200746 4707 scope.go:117] "RemoveContainer" containerID="eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395" Feb 18 05:51:15 crc kubenswrapper[4707]: E0218 05:51:15.201190 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395\": container with ID starting with eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395 not found: ID does not exist" containerID="eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.201246 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395"} err="failed to get container status \"eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395\": rpc error: code = 
NotFound desc = could not find container \"eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395\": container with ID starting with eb905b30f57c421b987d7d04d85e4c8d84bfc606815836135bfffd11689e0395 not found: ID does not exist" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.201283 4707 scope.go:117] "RemoveContainer" containerID="a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.210527 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-74wxc"] Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.218993 4707 scope.go:117] "RemoveContainer" containerID="a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.219129 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-899f8988f-bvmnd"] Feb 18 05:51:15 crc kubenswrapper[4707]: E0218 05:51:15.219489 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.219507 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 18 05:51:15 crc kubenswrapper[4707]: E0218 05:51:15.219543 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" containerName="installer" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.219552 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" containerName="installer" Feb 18 05:51:15 crc kubenswrapper[4707]: E0218 05:51:15.219563 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8702f2-1cdf-48fb-ad08-e6f533cc8404" containerName="route-controller-manager" Feb 18 05:51:15 crc kubenswrapper[4707]: 
I0218 05:51:15.219572 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8702f2-1cdf-48fb-ad08-e6f533cc8404" containerName="route-controller-manager" Feb 18 05:51:15 crc kubenswrapper[4707]: E0218 05:51:15.219613 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24beed91-e86e-4dae-a372-ea06be0cefb9" containerName="controller-manager" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.219623 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24beed91-e86e-4dae-a372-ea06be0cefb9" containerName="controller-manager" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.219783 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.219837 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8702f2-1cdf-48fb-ad08-e6f533cc8404" containerName="route-controller-manager" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.219852 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24beed91-e86e-4dae-a372-ea06be0cefb9" containerName="controller-manager" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.219862 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf16072-3e25-4a57-b95a-86cb7a8681ac" containerName="installer" Feb 18 05:51:15 crc kubenswrapper[4707]: E0218 05:51:15.220273 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228\": container with ID starting with a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228 not found: ID does not exist" containerID="a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.220322 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228"} err="failed to get container status \"a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228\": rpc error: code = NotFound desc = could not find container \"a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228\": container with ID starting with a78bf1a049a79bcba839abf4e327de603056c5d9e618949f73f0d1c304178228 not found: ID does not exist" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.220495 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.223400 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.223724 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.229662 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn"] Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.229726 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nbvhn"] Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.230390 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.230591 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.232922 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.234198 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.236680 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-899f8988f-bvmnd"] Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.242057 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.281455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4a896b8-5434-4e24-8363-cc58013e92c4-serving-cert\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.281503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-client-ca\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.281536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-proxy-ca-bundles\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.281732 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qp9t\" (UniqueName: \"kubernetes.io/projected/f4a896b8-5434-4e24-8363-cc58013e92c4-kube-api-access-9qp9t\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.281838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-config\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.289602 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6"] Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.290245 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.292012 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.292059 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.292468 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.292615 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.292956 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.293597 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.315826 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6"] Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.382609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4a896b8-5434-4e24-8363-cc58013e92c4-serving-cert\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.382650 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16362c77-d7a7-4ceb-972a-741c893087e5-config\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.382674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-client-ca\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.382710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-proxy-ca-bundles\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.382751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qp9t\" (UniqueName: \"kubernetes.io/projected/f4a896b8-5434-4e24-8363-cc58013e92c4-kube-api-access-9qp9t\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.382852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-config\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " 
pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.382880 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16362c77-d7a7-4ceb-972a-741c893087e5-serving-cert\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.382904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16362c77-d7a7-4ceb-972a-741c893087e5-client-ca\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.382926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktbv\" (UniqueName: \"kubernetes.io/projected/16362c77-d7a7-4ceb-972a-741c893087e5-kube-api-access-bktbv\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.383858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-client-ca\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.383896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-proxy-ca-bundles\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.384228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-config\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.386519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4a896b8-5434-4e24-8363-cc58013e92c4-serving-cert\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.400333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qp9t\" (UniqueName: \"kubernetes.io/projected/f4a896b8-5434-4e24-8363-cc58013e92c4-kube-api-access-9qp9t\") pod \"controller-manager-899f8988f-bvmnd\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.484511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16362c77-d7a7-4ceb-972a-741c893087e5-config\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.484629 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16362c77-d7a7-4ceb-972a-741c893087e5-serving-cert\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.484658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16362c77-d7a7-4ceb-972a-741c893087e5-client-ca\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.484683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktbv\" (UniqueName: \"kubernetes.io/projected/16362c77-d7a7-4ceb-972a-741c893087e5-kube-api-access-bktbv\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.485501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16362c77-d7a7-4ceb-972a-741c893087e5-client-ca\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.485693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16362c77-d7a7-4ceb-972a-741c893087e5-config\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " 
pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.487765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16362c77-d7a7-4ceb-972a-741c893087e5-serving-cert\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.500927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktbv\" (UniqueName: \"kubernetes.io/projected/16362c77-d7a7-4ceb-972a-741c893087e5-kube-api-access-bktbv\") pod \"route-controller-manager-94686984c-2jxt6\" (UID: \"16362c77-d7a7-4ceb-972a-741c893087e5\") " pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.542892 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.603287 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.705626 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-899f8988f-bvmnd"] Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.798656 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-899f8988f-bvmnd"] Feb 18 05:51:15 crc kubenswrapper[4707]: W0218 05:51:15.812996 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4a896b8_5434_4e24_8363_cc58013e92c4.slice/crio-dd7422d6413ee7ec3c6d7613d53ef525b8f34bef1c6cbd759abc0a45f3f5ab26 WatchSource:0}: Error finding container dd7422d6413ee7ec3c6d7613d53ef525b8f34bef1c6cbd759abc0a45f3f5ab26: Status 404 returned error can't find the container with id dd7422d6413ee7ec3c6d7613d53ef525b8f34bef1c6cbd759abc0a45f3f5ab26 Feb 18 05:51:15 crc kubenswrapper[4707]: I0218 05:51:15.897849 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6"] Feb 18 05:51:15 crc kubenswrapper[4707]: W0218 05:51:15.903649 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16362c77_d7a7_4ceb_972a_741c893087e5.slice/crio-879ab6066dc037ef962b35cab245866f45f848992d284e2871b767603cfedad7 WatchSource:0}: Error finding container 879ab6066dc037ef962b35cab245866f45f848992d284e2871b767603cfedad7: Status 404 returned error can't find the container with id 879ab6066dc037ef962b35cab245866f45f848992d284e2871b767603cfedad7 Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.059529 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24beed91-e86e-4dae-a372-ea06be0cefb9" 
path="/var/lib/kubelet/pods/24beed91-e86e-4dae-a372-ea06be0cefb9/volumes" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.060496 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd8702f2-1cdf-48fb-ad08-e6f533cc8404" path="/var/lib/kubelet/pods/fd8702f2-1cdf-48fb-ad08-e6f533cc8404/volumes" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.180439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" event={"ID":"16362c77-d7a7-4ceb-972a-741c893087e5","Type":"ContainerStarted","Data":"88d4dac4be2b34fd74b87092be14591f1a4d30a7d527efc811ad2ebe5be348ce"} Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.180478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" event={"ID":"16362c77-d7a7-4ceb-972a-741c893087e5","Type":"ContainerStarted","Data":"879ab6066dc037ef962b35cab245866f45f848992d284e2871b767603cfedad7"} Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.180804 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.182514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" event={"ID":"f4a896b8-5434-4e24-8363-cc58013e92c4","Type":"ContainerStarted","Data":"2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a"} Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.182549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" event={"ID":"f4a896b8-5434-4e24-8363-cc58013e92c4","Type":"ContainerStarted","Data":"dd7422d6413ee7ec3c6d7613d53ef525b8f34bef1c6cbd759abc0a45f3f5ab26"} Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.182617 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" podUID="f4a896b8-5434-4e24-8363-cc58013e92c4" containerName="controller-manager" containerID="cri-o://2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a" gracePeriod=30 Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.182723 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.189240 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.203031 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" podStartSLOduration=1.203008331 podStartE2EDuration="1.203008331s" podCreationTimestamp="2026-02-18 05:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:16.201505948 +0000 UTC m=+212.849465082" watchObservedRunningTime="2026-02-18 05:51:16.203008331 +0000 UTC m=+212.850967465" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.217726 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" podStartSLOduration=1.217707263 podStartE2EDuration="1.217707263s" podCreationTimestamp="2026-02-18 05:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:16.213900824 +0000 UTC m=+212.861859958" watchObservedRunningTime="2026-02-18 05:51:16.217707263 +0000 UTC m=+212.865666397" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.471274 4707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.495155 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qp9t\" (UniqueName: \"kubernetes.io/projected/f4a896b8-5434-4e24-8363-cc58013e92c4-kube-api-access-9qp9t\") pod \"f4a896b8-5434-4e24-8363-cc58013e92c4\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.495256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-client-ca\") pod \"f4a896b8-5434-4e24-8363-cc58013e92c4\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.495340 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-config\") pod \"f4a896b8-5434-4e24-8363-cc58013e92c4\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.495368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-proxy-ca-bundles\") pod \"f4a896b8-5434-4e24-8363-cc58013e92c4\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.495399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4a896b8-5434-4e24-8363-cc58013e92c4-serving-cert\") pod \"f4a896b8-5434-4e24-8363-cc58013e92c4\" (UID: \"f4a896b8-5434-4e24-8363-cc58013e92c4\") " Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.497217 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "f4a896b8-5434-4e24-8363-cc58013e92c4" (UID: "f4a896b8-5434-4e24-8363-cc58013e92c4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.498683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f4a896b8-5434-4e24-8363-cc58013e92c4" (UID: "f4a896b8-5434-4e24-8363-cc58013e92c4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.498781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-config" (OuterVolumeSpecName: "config") pod "f4a896b8-5434-4e24-8363-cc58013e92c4" (UID: "f4a896b8-5434-4e24-8363-cc58013e92c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.501370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a896b8-5434-4e24-8363-cc58013e92c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f4a896b8-5434-4e24-8363-cc58013e92c4" (UID: "f4a896b8-5434-4e24-8363-cc58013e92c4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.501949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a896b8-5434-4e24-8363-cc58013e92c4-kube-api-access-9qp9t" (OuterVolumeSpecName: "kube-api-access-9qp9t") pod "f4a896b8-5434-4e24-8363-cc58013e92c4" (UID: "f4a896b8-5434-4e24-8363-cc58013e92c4"). 
InnerVolumeSpecName "kube-api-access-9qp9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.596468 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qp9t\" (UniqueName: \"kubernetes.io/projected/f4a896b8-5434-4e24-8363-cc58013e92c4-kube-api-access-9qp9t\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.596503 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.596513 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.596523 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4a896b8-5434-4e24-8363-cc58013e92c4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.596533 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4a896b8-5434-4e24-8363-cc58013e92c4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:16 crc kubenswrapper[4707]: I0218 05:51:16.716621 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-94686984c-2jxt6" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.187967 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4a896b8-5434-4e24-8363-cc58013e92c4" containerID="2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a" exitCode=0 Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.188695 4707 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.192848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" event={"ID":"f4a896b8-5434-4e24-8363-cc58013e92c4","Type":"ContainerDied","Data":"2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a"} Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.192885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-899f8988f-bvmnd" event={"ID":"f4a896b8-5434-4e24-8363-cc58013e92c4","Type":"ContainerDied","Data":"dd7422d6413ee7ec3c6d7613d53ef525b8f34bef1c6cbd759abc0a45f3f5ab26"} Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.192905 4707 scope.go:117] "RemoveContainer" containerID="2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.207674 4707 scope.go:117] "RemoveContainer" containerID="2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a" Feb 18 05:51:17 crc kubenswrapper[4707]: E0218 05:51:17.208281 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a\": container with ID starting with 2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a not found: ID does not exist" containerID="2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.208349 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a"} err="failed to get container status \"2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a\": rpc error: code = NotFound desc = could not find 
container \"2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a\": container with ID starting with 2e7cf0d6318e2dee4444bd9b9a33a184c32befff38a3f12779f08005365cb28a not found: ID does not exist" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.215714 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-899f8988f-bvmnd"] Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.219595 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-899f8988f-bvmnd"] Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.223947 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-nnzsq"] Feb 18 05:51:17 crc kubenswrapper[4707]: E0218 05:51:17.224145 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a896b8-5434-4e24-8363-cc58013e92c4" containerName="controller-manager" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.224163 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a896b8-5434-4e24-8363-cc58013e92c4" containerName="controller-manager" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.224253 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a896b8-5434-4e24-8363-cc58013e92c4" containerName="controller-manager" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.224630 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.226496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.226610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.226952 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.227229 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.227238 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.228202 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.232473 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-nnzsq"] Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.233071 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.304302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qt8\" (UniqueName: \"kubernetes.io/projected/52a0b51e-c0d5-4536-9537-b0cee22b7a48-kube-api-access-s9qt8\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " 
pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.304352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-client-ca\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.304390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-config\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.304458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-proxy-ca-bundles\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.304521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52a0b51e-c0d5-4536-9537-b0cee22b7a48-serving-cert\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.405415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/52a0b51e-c0d5-4536-9537-b0cee22b7a48-serving-cert\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.405500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qt8\" (UniqueName: \"kubernetes.io/projected/52a0b51e-c0d5-4536-9537-b0cee22b7a48-kube-api-access-s9qt8\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.405529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-client-ca\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.405559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-config\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.405577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-proxy-ca-bundles\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.406818 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-client-ca\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.406898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-proxy-ca-bundles\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.407896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-config\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.410209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52a0b51e-c0d5-4536-9537-b0cee22b7a48-serving-cert\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.422072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qt8\" (UniqueName: \"kubernetes.io/projected/52a0b51e-c0d5-4536-9537-b0cee22b7a48-kube-api-access-s9qt8\") pod \"controller-manager-6799c5f44c-nnzsq\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 
05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.543413 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:17 crc kubenswrapper[4707]: I0218 05:51:17.925239 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-nnzsq"] Feb 18 05:51:18 crc kubenswrapper[4707]: I0218 05:51:18.070550 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a896b8-5434-4e24-8363-cc58013e92c4" path="/var/lib/kubelet/pods/f4a896b8-5434-4e24-8363-cc58013e92c4/volumes" Feb 18 05:51:18 crc kubenswrapper[4707]: I0218 05:51:18.194665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" event={"ID":"52a0b51e-c0d5-4536-9537-b0cee22b7a48","Type":"ContainerStarted","Data":"697ec5b0c8c989bf224b9984a670324789b4dc1d04d7b6d27cdf4a5436b89321"} Feb 18 05:51:18 crc kubenswrapper[4707]: I0218 05:51:18.194720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" event={"ID":"52a0b51e-c0d5-4536-9537-b0cee22b7a48","Type":"ContainerStarted","Data":"ef0304b1bdcb63cc02ce8c962faac92f8019db5b977855e022d2c74ea65f6b16"} Feb 18 05:51:18 crc kubenswrapper[4707]: I0218 05:51:18.209464 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" podStartSLOduration=3.209433069 podStartE2EDuration="3.209433069s" podCreationTimestamp="2026-02-18 05:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:18.208565484 +0000 UTC m=+214.856524638" watchObservedRunningTime="2026-02-18 05:51:18.209433069 +0000 UTC m=+214.857392203" Feb 18 05:51:19 crc kubenswrapper[4707]: I0218 05:51:19.198281 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:19 crc kubenswrapper[4707]: I0218 05:51:19.203298 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:21 crc kubenswrapper[4707]: I0218 05:51:21.382435 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:51:21 crc kubenswrapper[4707]: I0218 05:51:21.382840 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:51:21 crc kubenswrapper[4707]: I0218 05:51:21.382904 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:51:21 crc kubenswrapper[4707]: I0218 05:51:21.383524 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb616e6ef6d668e0c27124102abb5c64f761976e7550e81d8eb8d94a07fb5fd4"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 05:51:21 crc kubenswrapper[4707]: I0218 05:51:21.383590 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" 
containerName="machine-config-daemon" containerID="cri-o://bb616e6ef6d668e0c27124102abb5c64f761976e7550e81d8eb8d94a07fb5fd4" gracePeriod=600 Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.213840 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="bb616e6ef6d668e0c27124102abb5c64f761976e7550e81d8eb8d94a07fb5fd4" exitCode=0 Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.213911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"bb616e6ef6d668e0c27124102abb5c64f761976e7550e81d8eb8d94a07fb5fd4"} Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.214178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"c30aded239e5b6a5a3a43a43d3d8062408ccb46a9109591c1b4a41345a3dce40"} Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.417197 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5kz9v"] Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.419815 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.431755 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5kz9v"] Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.561680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c84p\" (UniqueName: \"kubernetes.io/projected/987e0f4b-39fd-4d0c-beb9-be613dd005bc-kube-api-access-8c84p\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.561735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.561759 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/987e0f4b-39fd-4d0c-beb9-be613dd005bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.561777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/987e0f4b-39fd-4d0c-beb9-be613dd005bc-bound-sa-token\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.561935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/987e0f4b-39fd-4d0c-beb9-be613dd005bc-registry-certificates\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.562014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/987e0f4b-39fd-4d0c-beb9-be613dd005bc-registry-tls\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.562039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/987e0f4b-39fd-4d0c-beb9-be613dd005bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.562240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/987e0f4b-39fd-4d0c-beb9-be613dd005bc-trusted-ca\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.583044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.664252 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/987e0f4b-39fd-4d0c-beb9-be613dd005bc-registry-certificates\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.664340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/987e0f4b-39fd-4d0c-beb9-be613dd005bc-registry-tls\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.664371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/987e0f4b-39fd-4d0c-beb9-be613dd005bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.664398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/987e0f4b-39fd-4d0c-beb9-be613dd005bc-trusted-ca\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.664433 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c84p\" (UniqueName: \"kubernetes.io/projected/987e0f4b-39fd-4d0c-beb9-be613dd005bc-kube-api-access-8c84p\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.664461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/987e0f4b-39fd-4d0c-beb9-be613dd005bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.664499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/987e0f4b-39fd-4d0c-beb9-be613dd005bc-bound-sa-token\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.665034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/987e0f4b-39fd-4d0c-beb9-be613dd005bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.665782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/987e0f4b-39fd-4d0c-beb9-be613dd005bc-trusted-ca\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 
05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.665815 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/987e0f4b-39fd-4d0c-beb9-be613dd005bc-registry-certificates\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.679453 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/987e0f4b-39fd-4d0c-beb9-be613dd005bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.679538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/987e0f4b-39fd-4d0c-beb9-be613dd005bc-registry-tls\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.681699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/987e0f4b-39fd-4d0c-beb9-be613dd005bc-bound-sa-token\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.683712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c84p\" (UniqueName: \"kubernetes.io/projected/987e0f4b-39fd-4d0c-beb9-be613dd005bc-kube-api-access-8c84p\") pod \"image-registry-66df7c8f76-5kz9v\" (UID: \"987e0f4b-39fd-4d0c-beb9-be613dd005bc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:22 crc kubenswrapper[4707]: I0218 05:51:22.743757 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:23 crc kubenswrapper[4707]: I0218 05:51:23.162346 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5kz9v"] Feb 18 05:51:23 crc kubenswrapper[4707]: W0218 05:51:23.165523 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987e0f4b_39fd_4d0c_beb9_be613dd005bc.slice/crio-b1c51f78cdfd2dee418281bcf86cef94e5905c5d3bf29b0b088e78b2b0a46d65 WatchSource:0}: Error finding container b1c51f78cdfd2dee418281bcf86cef94e5905c5d3bf29b0b088e78b2b0a46d65: Status 404 returned error can't find the container with id b1c51f78cdfd2dee418281bcf86cef94e5905c5d3bf29b0b088e78b2b0a46d65 Feb 18 05:51:23 crc kubenswrapper[4707]: I0218 05:51:23.221749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" event={"ID":"987e0f4b-39fd-4d0c-beb9-be613dd005bc","Type":"ContainerStarted","Data":"b1c51f78cdfd2dee418281bcf86cef94e5905c5d3bf29b0b088e78b2b0a46d65"} Feb 18 05:51:24 crc kubenswrapper[4707]: I0218 05:51:24.227238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" event={"ID":"987e0f4b-39fd-4d0c-beb9-be613dd005bc","Type":"ContainerStarted","Data":"36d6b90261de979d23c653dd2692fca1422b6fe352dc38e810f9ae03e8a6bf65"} Feb 18 05:51:24 crc kubenswrapper[4707]: I0218 05:51:24.227589 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:24 crc kubenswrapper[4707]: I0218 05:51:24.250430 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" podStartSLOduration=2.2504112960000002 podStartE2EDuration="2.250411296s" podCreationTimestamp="2026-02-18 05:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:24.250324672 +0000 UTC m=+220.898283806" watchObservedRunningTime="2026-02-18 05:51:24.250411296 +0000 UTC m=+220.898370440" Feb 18 05:51:42 crc kubenswrapper[4707]: I0218 05:51:42.749716 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-5kz9v" Feb 18 05:51:42 crc kubenswrapper[4707]: I0218 05:51:42.813582 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r74s4"] Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.418402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-prbjm"] Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.419400 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-prbjm" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerName="registry-server" containerID="cri-o://1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7" gracePeriod=30 Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.425048 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5pvbq"] Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.425251 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5pvbq" podUID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerName="registry-server" containerID="cri-o://f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3" gracePeriod=30 Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.431956 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g8pt7"] Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.433126 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" podUID="6c734b6c-4efa-4755-b79c-2eb9d132ebcb" containerName="marketplace-operator" containerID="cri-o://d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa" gracePeriod=30 Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.451665 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bx4t"] Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.451984 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4bx4t" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerName="registry-server" containerID="cri-o://40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5" gracePeriod=30 Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.467227 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7xmc"] Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.468070 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.476945 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k69vm"] Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.477200 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k69vm" podUID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerName="registry-server" containerID="cri-o://31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd" gracePeriod=30 Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.480922 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7xmc"] Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.630510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ef5bee5f-c0c3-471e-88fb-43735b7c0b31-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7xmc\" (UID: \"ef5bee5f-c0c3-471e-88fb-43735b7c0b31\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.630954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef5bee5f-c0c3-471e-88fb-43735b7c0b31-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7xmc\" (UID: \"ef5bee5f-c0c3-471e-88fb-43735b7c0b31\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.631019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qsg2\" (UniqueName: 
\"kubernetes.io/projected/ef5bee5f-c0c3-471e-88fb-43735b7c0b31-kube-api-access-4qsg2\") pod \"marketplace-operator-79b997595-z7xmc\" (UID: \"ef5bee5f-c0c3-471e-88fb-43735b7c0b31\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.732122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ef5bee5f-c0c3-471e-88fb-43735b7c0b31-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7xmc\" (UID: \"ef5bee5f-c0c3-471e-88fb-43735b7c0b31\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.732176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef5bee5f-c0c3-471e-88fb-43735b7c0b31-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7xmc\" (UID: \"ef5bee5f-c0c3-471e-88fb-43735b7c0b31\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.732227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qsg2\" (UniqueName: \"kubernetes.io/projected/ef5bee5f-c0c3-471e-88fb-43735b7c0b31-kube-api-access-4qsg2\") pod \"marketplace-operator-79b997595-z7xmc\" (UID: \"ef5bee5f-c0c3-471e-88fb-43735b7c0b31\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.733640 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef5bee5f-c0c3-471e-88fb-43735b7c0b31-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7xmc\" (UID: \"ef5bee5f-c0c3-471e-88fb-43735b7c0b31\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc 
kubenswrapper[4707]: I0218 05:51:49.741628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ef5bee5f-c0c3-471e-88fb-43735b7c0b31-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7xmc\" (UID: \"ef5bee5f-c0c3-471e-88fb-43735b7c0b31\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.747214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qsg2\" (UniqueName: \"kubernetes.io/projected/ef5bee5f-c0c3-471e-88fb-43735b7c0b31-kube-api-access-4qsg2\") pod \"marketplace-operator-79b997595-z7xmc\" (UID: \"ef5bee5f-c0c3-471e-88fb-43735b7c0b31\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.790643 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.906950 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.936383 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-operator-metrics\") pod \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.936474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztmsn\" (UniqueName: \"kubernetes.io/projected/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-kube-api-access-ztmsn\") pod \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.936575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-trusted-ca\") pod \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\" (UID: \"6c734b6c-4efa-4755-b79c-2eb9d132ebcb\") " Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.937682 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6c734b6c-4efa-4755-b79c-2eb9d132ebcb" (UID: "6c734b6c-4efa-4755-b79c-2eb9d132ebcb"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.941345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6c734b6c-4efa-4755-b79c-2eb9d132ebcb" (UID: "6c734b6c-4efa-4755-b79c-2eb9d132ebcb"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:51:49 crc kubenswrapper[4707]: I0218 05:51:49.942166 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-kube-api-access-ztmsn" (OuterVolumeSpecName: "kube-api-access-ztmsn") pod "6c734b6c-4efa-4755-b79c-2eb9d132ebcb" (UID: "6c734b6c-4efa-4755-b79c-2eb9d132ebcb"). InnerVolumeSpecName "kube-api-access-ztmsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.037837 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.038173 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.038187 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztmsn\" (UniqueName: \"kubernetes.io/projected/6c734b6c-4efa-4755-b79c-2eb9d132ebcb-kube-api-access-ztmsn\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.103189 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.108896 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.110896 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5pvbq" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.131519 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnpgm\" (UniqueName: \"kubernetes.io/projected/c2a7cf05-4af5-406e-8395-75b3634484e9-kube-api-access-tnpgm\") pod \"c2a7cf05-4af5-406e-8395-75b3634484e9\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-utilities\") pod \"065bc74d-6afe-4b4b-83a6-494643b467d7\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239714 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-catalog-content\") pod \"37ed8460-3a60-4ec0-b074-69244d0a46cf\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-utilities\") pod 
\"37ed8460-3a60-4ec0-b074-69244d0a46cf\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9llf\" (UniqueName: \"kubernetes.io/projected/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-kube-api-access-s9llf\") pod \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6bxv\" (UniqueName: \"kubernetes.io/projected/37ed8460-3a60-4ec0-b074-69244d0a46cf-kube-api-access-q6bxv\") pod \"37ed8460-3a60-4ec0-b074-69244d0a46cf\" (UID: \"37ed8460-3a60-4ec0-b074-69244d0a46cf\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-catalog-content\") pod \"c2a7cf05-4af5-406e-8395-75b3634484e9\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-catalog-content\") pod \"065bc74d-6afe-4b4b-83a6-494643b467d7\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-utilities\") pod \"c2a7cf05-4af5-406e-8395-75b3634484e9\" (UID: \"c2a7cf05-4af5-406e-8395-75b3634484e9\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239914 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-utilities\") pod \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239937 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtqqv\" (UniqueName: \"kubernetes.io/projected/065bc74d-6afe-4b4b-83a6-494643b467d7-kube-api-access-xtqqv\") pod \"065bc74d-6afe-4b4b-83a6-494643b467d7\" (UID: \"065bc74d-6afe-4b4b-83a6-494643b467d7\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.239955 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-catalog-content\") pod \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\" (UID: \"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a\") " Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.240642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-utilities" (OuterVolumeSpecName: "utilities") pod "37ed8460-3a60-4ec0-b074-69244d0a46cf" (UID: "37ed8460-3a60-4ec0-b074-69244d0a46cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.240978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-utilities" (OuterVolumeSpecName: "utilities") pod "7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" (UID: "7f2ce0f9-b604-492c-b06e-c7b91ad94b6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.241055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-utilities" (OuterVolumeSpecName: "utilities") pod "c2a7cf05-4af5-406e-8395-75b3634484e9" (UID: "c2a7cf05-4af5-406e-8395-75b3634484e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.242080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-utilities" (OuterVolumeSpecName: "utilities") pod "065bc74d-6afe-4b4b-83a6-494643b467d7" (UID: "065bc74d-6afe-4b4b-83a6-494643b467d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.243584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ed8460-3a60-4ec0-b074-69244d0a46cf-kube-api-access-q6bxv" (OuterVolumeSpecName: "kube-api-access-q6bxv") pod "37ed8460-3a60-4ec0-b074-69244d0a46cf" (UID: "37ed8460-3a60-4ec0-b074-69244d0a46cf"). InnerVolumeSpecName "kube-api-access-q6bxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.243735 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a7cf05-4af5-406e-8395-75b3634484e9-kube-api-access-tnpgm" (OuterVolumeSpecName: "kube-api-access-tnpgm") pod "c2a7cf05-4af5-406e-8395-75b3634484e9" (UID: "c2a7cf05-4af5-406e-8395-75b3634484e9"). InnerVolumeSpecName "kube-api-access-tnpgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.244592 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-kube-api-access-s9llf" (OuterVolumeSpecName: "kube-api-access-s9llf") pod "7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" (UID: "7f2ce0f9-b604-492c-b06e-c7b91ad94b6a"). InnerVolumeSpecName "kube-api-access-s9llf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.246345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/065bc74d-6afe-4b4b-83a6-494643b467d7-kube-api-access-xtqqv" (OuterVolumeSpecName: "kube-api-access-xtqqv") pod "065bc74d-6afe-4b4b-83a6-494643b467d7" (UID: "065bc74d-6afe-4b4b-83a6-494643b467d7"). InnerVolumeSpecName "kube-api-access-xtqqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.264101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2a7cf05-4af5-406e-8395-75b3634484e9" (UID: "c2a7cf05-4af5-406e-8395-75b3634484e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.306233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "065bc74d-6afe-4b4b-83a6-494643b467d7" (UID: "065bc74d-6afe-4b4b-83a6-494643b467d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.307095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37ed8460-3a60-4ec0-b074-69244d0a46cf" (UID: "37ed8460-3a60-4ec0-b074-69244d0a46cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.335218 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7xmc"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340772 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340807 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ed8460-3a60-4ec0-b074-69244d0a46cf-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340817 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9llf\" (UniqueName: \"kubernetes.io/projected/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-kube-api-access-s9llf\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340827 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6bxv\" (UniqueName: \"kubernetes.io/projected/37ed8460-3a60-4ec0-b074-69244d0a46cf-kube-api-access-q6bxv\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340835 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340843 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340851 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a7cf05-4af5-406e-8395-75b3634484e9-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340858 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340867 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtqqv\" (UniqueName: \"kubernetes.io/projected/065bc74d-6afe-4b4b-83a6-494643b467d7-kube-api-access-xtqqv\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340875 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnpgm\" (UniqueName: \"kubernetes.io/projected/c2a7cf05-4af5-406e-8395-75b3634484e9-kube-api-access-tnpgm\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.340884 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/065bc74d-6afe-4b4b-83a6-494643b467d7-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.361765 4707 generic.go:334] "Generic (PLEG): container finished" podID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerID="31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd" exitCode=0 Feb 18 05:51:50 crc 
kubenswrapper[4707]: I0218 05:51:50.361827 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k69vm" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.361840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k69vm" event={"ID":"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a","Type":"ContainerDied","Data":"31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.362343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k69vm" event={"ID":"7f2ce0f9-b604-492c-b06e-c7b91ad94b6a","Type":"ContainerDied","Data":"6f66e4905eccaf215893167932a35b55110b4b94ca455c5c7493005cef95c107"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.362373 4707 scope.go:117] "RemoveContainer" containerID="31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.363698 4707 generic.go:334] "Generic (PLEG): container finished" podID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerID="40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5" exitCode=0 Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.363768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bx4t" event={"ID":"c2a7cf05-4af5-406e-8395-75b3634484e9","Type":"ContainerDied","Data":"40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.363813 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bx4t" event={"ID":"c2a7cf05-4af5-406e-8395-75b3634484e9","Type":"ContainerDied","Data":"5b651ab6ad623b27a73275b2d42f410ae1ecc02c93d710a459511531f5cb922a"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.363945 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bx4t" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.377642 4707 generic.go:334] "Generic (PLEG): container finished" podID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerID="f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3" exitCode=0 Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.377682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pvbq" event={"ID":"37ed8460-3a60-4ec0-b074-69244d0a46cf","Type":"ContainerDied","Data":"f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.377710 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5pvbq" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.377722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pvbq" event={"ID":"37ed8460-3a60-4ec0-b074-69244d0a46cf","Type":"ContainerDied","Data":"bfed1abc20e4a3900fe3cf218d7787cc511569c90d1bc8df85591940fa740395"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.378289 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" (UID: "7f2ce0f9-b604-492c-b06e-c7b91ad94b6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.380238 4707 scope.go:117] "RemoveContainer" containerID="5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.385420 4707 generic.go:334] "Generic (PLEG): container finished" podID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerID="1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7" exitCode=0 Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.385448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbjm" event={"ID":"065bc74d-6afe-4b4b-83a6-494643b467d7","Type":"ContainerDied","Data":"1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.385482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbjm" event={"ID":"065bc74d-6afe-4b4b-83a6-494643b467d7","Type":"ContainerDied","Data":"0ecd4e337220e93fb4702b33042090c4f6ea010e6818fe6079f71df2354bd693"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.385563 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-prbjm" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.386826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" event={"ID":"ef5bee5f-c0c3-471e-88fb-43735b7c0b31","Type":"ContainerStarted","Data":"9d866dd648f70601260ac7b135993cacf3bc7e804958b16f1564c26249669aff"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.387973 4707 generic.go:334] "Generic (PLEG): container finished" podID="6c734b6c-4efa-4755-b79c-2eb9d132ebcb" containerID="d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa" exitCode=0 Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.388010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" event={"ID":"6c734b6c-4efa-4755-b79c-2eb9d132ebcb","Type":"ContainerDied","Data":"d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.388027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" event={"ID":"6c734b6c-4efa-4755-b79c-2eb9d132ebcb","Type":"ContainerDied","Data":"fe601dff3cc18848f96966d10aa65189d5c5474ff069dcc4d1aca9658334335c"} Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.388091 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g8pt7" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.397528 4707 scope.go:117] "RemoveContainer" containerID="0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.405426 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bx4t"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.408666 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bx4t"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.417027 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g8pt7"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.419920 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g8pt7"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.425525 4707 scope.go:117] "RemoveContainer" containerID="31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.426706 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd\": container with ID starting with 31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd not found: ID does not exist" containerID="31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.426744 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd"} err="failed to get container status \"31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd\": rpc error: code = NotFound desc = 
could not find container \"31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd\": container with ID starting with 31b435c5b38bec5978a41f6172e60de29082cfd5ce8e139af4804d1e301c00dd not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.426772 4707 scope.go:117] "RemoveContainer" containerID="5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.427141 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d\": container with ID starting with 5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d not found: ID does not exist" containerID="5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.427174 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d"} err="failed to get container status \"5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d\": rpc error: code = NotFound desc = could not find container \"5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d\": container with ID starting with 5d2b364989bf6a5766160485fe0a2b6a2a849380b2644957a6e6c2f2377e135d not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.427198 4707 scope.go:117] "RemoveContainer" containerID="0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.427499 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b\": container with ID starting with 0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b not 
found: ID does not exist" containerID="0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.427535 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b"} err="failed to get container status \"0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b\": rpc error: code = NotFound desc = could not find container \"0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b\": container with ID starting with 0f86ffb6907d10245d7e2a3fd22beab413a76ab2a51d1976a66d90781345193b not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.427562 4707 scope.go:117] "RemoveContainer" containerID="40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.431611 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5pvbq"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.435228 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5pvbq"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.442339 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.442865 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-prbjm"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.448193 4707 scope.go:117] "RemoveContainer" containerID="2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.450208 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-prbjm"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.462404 4707 scope.go:117] "RemoveContainer" containerID="d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.478924 4707 scope.go:117] "RemoveContainer" containerID="40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.479320 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5\": container with ID starting with 40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5 not found: ID does not exist" containerID="40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.479354 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5"} err="failed to get container status \"40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5\": rpc error: code = NotFound desc = could not find container \"40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5\": container with ID starting with 40da8c549c7466376b80bc1dfaf0f082d579c72c3be625e303aee268746551a5 not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.479379 4707 scope.go:117] "RemoveContainer" containerID="2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.479720 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b\": container with ID starting with 
2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b not found: ID does not exist" containerID="2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.479761 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b"} err="failed to get container status \"2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b\": rpc error: code = NotFound desc = could not find container \"2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b\": container with ID starting with 2d188732f62a98aac10e76f9964718c0e9becf1b19ae7995e0de70c4edb6bc4b not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.479783 4707 scope.go:117] "RemoveContainer" containerID="d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.480123 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a\": container with ID starting with d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a not found: ID does not exist" containerID="d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.480191 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a"} err="failed to get container status \"d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a\": rpc error: code = NotFound desc = could not find container \"d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a\": container with ID starting with d1597a12df5f215f77f01557251828c1462ee1366b6c831ee18a7641d100e28a not found: ID does not 
exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.480228 4707 scope.go:117] "RemoveContainer" containerID="f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.491843 4707 scope.go:117] "RemoveContainer" containerID="c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.505371 4707 scope.go:117] "RemoveContainer" containerID="22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.522599 4707 scope.go:117] "RemoveContainer" containerID="f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.523300 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3\": container with ID starting with f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3 not found: ID does not exist" containerID="f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.523340 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3"} err="failed to get container status \"f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3\": rpc error: code = NotFound desc = could not find container \"f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3\": container with ID starting with f8dc7a9ffe6ada5b6fd1248b3d51392a6925f55984c6a4cdf0d6941a8c51cda3 not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.523371 4707 scope.go:117] "RemoveContainer" containerID="c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded" Feb 18 05:51:50 crc 
kubenswrapper[4707]: E0218 05:51:50.523726 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded\": container with ID starting with c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded not found: ID does not exist" containerID="c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.523768 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded"} err="failed to get container status \"c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded\": rpc error: code = NotFound desc = could not find container \"c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded\": container with ID starting with c6337d4d398074f4de16ec29d8178cf4942af54e4266276ef61cbf73e6e98ded not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.523854 4707 scope.go:117] "RemoveContainer" containerID="22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.524324 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123\": container with ID starting with 22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123 not found: ID does not exist" containerID="22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.524349 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123"} err="failed to get container status 
\"22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123\": rpc error: code = NotFound desc = could not find container \"22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123\": container with ID starting with 22616dce90d15cff1c8233c9da008156da7154711988134c8af714879f4f5123 not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.524365 4707 scope.go:117] "RemoveContainer" containerID="1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.541140 4707 scope.go:117] "RemoveContainer" containerID="159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.607452 4707 scope.go:117] "RemoveContainer" containerID="9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.620312 4707 scope.go:117] "RemoveContainer" containerID="1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.620737 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7\": container with ID starting with 1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7 not found: ID does not exist" containerID="1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.620772 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7"} err="failed to get container status \"1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7\": rpc error: code = NotFound desc = could not find container \"1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7\": container with ID starting 
with 1e6bf6dd19e45137b1f9ccc9745b93849e705fabb901c808dc18d3dc2006f5e7 not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.620851 4707 scope.go:117] "RemoveContainer" containerID="159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.621078 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440\": container with ID starting with 159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440 not found: ID does not exist" containerID="159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.621104 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440"} err="failed to get container status \"159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440\": rpc error: code = NotFound desc = could not find container \"159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440\": container with ID starting with 159776b921ba63e827013af64e12a41ba9127cf9f610d1e0cdc7ccfba4c25440 not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.621122 4707 scope.go:117] "RemoveContainer" containerID="9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.621461 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90\": container with ID starting with 9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90 not found: ID does not exist" containerID="9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90" Feb 18 05:51:50 
crc kubenswrapper[4707]: I0218 05:51:50.621491 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90"} err="failed to get container status \"9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90\": rpc error: code = NotFound desc = could not find container \"9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90\": container with ID starting with 9f316167390328f10161b134e8112b49c5d73454477f80f127ae83941aac7d90 not found: ID does not exist" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.621506 4707 scope.go:117] "RemoveContainer" containerID="d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.637088 4707 scope.go:117] "RemoveContainer" containerID="d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa" Feb 18 05:51:50 crc kubenswrapper[4707]: E0218 05:51:50.637556 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa\": container with ID starting with d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa not found: ID does not exist" containerID="d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa" Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.637585 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa"} err="failed to get container status \"d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa\": rpc error: code = NotFound desc = could not find container \"d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa\": container with ID starting with d66a898574675b925c8fced8600a94a38e47f5bbc2e2d0b5601b0065dfaef7fa not found: ID does not exist" Feb 18 
05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.694774 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k69vm"] Feb 18 05:51:50 crc kubenswrapper[4707]: I0218 05:51:50.699335 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k69vm"] Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.402184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" event={"ID":"ef5bee5f-c0c3-471e-88fb-43735b7c0b31","Type":"ContainerStarted","Data":"151baa900ac47f02ad782272dbab8eeea76f7fa9bfe01dae06f721bc5eec5b8d"} Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.403255 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.410646 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433244 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rsz6w"] Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433495 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433508 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433519 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerName="extract-utilities" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433526 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerName="extract-utilities" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433539 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerName="extract-utilities" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433547 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerName="extract-utilities" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433560 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerName="extract-utilities" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433567 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerName="extract-utilities" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433578 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433584 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433596 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerName="extract-content" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433603 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerName="extract-content" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433615 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433624 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433633 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerName="extract-utilities" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433640 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerName="extract-utilities" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433650 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerName="extract-content" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433656 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerName="extract-content" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433665 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c734b6c-4efa-4755-b79c-2eb9d132ebcb" containerName="marketplace-operator" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433672 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c734b6c-4efa-4755-b79c-2eb9d132ebcb" containerName="marketplace-operator" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433683 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerName="extract-content" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433690 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerName="extract-content" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433699 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433707 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: E0218 05:51:51.433717 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerName="extract-content" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433724 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerName="extract-content" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433887 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433903 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433919 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c734b6c-4efa-4755-b79c-2eb9d132ebcb" containerName="marketplace-operator" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433928 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ed8460-3a60-4ec0-b074-69244d0a46cf" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.433936 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" containerName="registry-server" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.434129 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z7xmc" podStartSLOduration=2.434107445 podStartE2EDuration="2.434107445s" podCreationTimestamp="2026-02-18 05:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:51.427409193 +0000 UTC m=+248.075368337" 
watchObservedRunningTime="2026-02-18 05:51:51.434107445 +0000 UTC m=+248.082066579" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.434758 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.437001 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.449122 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rsz6w"] Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.460726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3ae920-45c5-4b49-aed2-d651c4de9499-utilities\") pod \"certified-operators-rsz6w\" (UID: \"2c3ae920-45c5-4b49-aed2-d651c4de9499\") " pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.460823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3ae920-45c5-4b49-aed2-d651c4de9499-catalog-content\") pod \"certified-operators-rsz6w\" (UID: \"2c3ae920-45c5-4b49-aed2-d651c4de9499\") " pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.461047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsmw8\" (UniqueName: \"kubernetes.io/projected/2c3ae920-45c5-4b49-aed2-d651c4de9499-kube-api-access-jsmw8\") pod \"certified-operators-rsz6w\" (UID: \"2c3ae920-45c5-4b49-aed2-d651c4de9499\") " pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.562239 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jsmw8\" (UniqueName: \"kubernetes.io/projected/2c3ae920-45c5-4b49-aed2-d651c4de9499-kube-api-access-jsmw8\") pod \"certified-operators-rsz6w\" (UID: \"2c3ae920-45c5-4b49-aed2-d651c4de9499\") " pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.562294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3ae920-45c5-4b49-aed2-d651c4de9499-utilities\") pod \"certified-operators-rsz6w\" (UID: \"2c3ae920-45c5-4b49-aed2-d651c4de9499\") " pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.562317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3ae920-45c5-4b49-aed2-d651c4de9499-catalog-content\") pod \"certified-operators-rsz6w\" (UID: \"2c3ae920-45c5-4b49-aed2-d651c4de9499\") " pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.562746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3ae920-45c5-4b49-aed2-d651c4de9499-utilities\") pod \"certified-operators-rsz6w\" (UID: \"2c3ae920-45c5-4b49-aed2-d651c4de9499\") " pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.562829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3ae920-45c5-4b49-aed2-d651c4de9499-catalog-content\") pod \"certified-operators-rsz6w\" (UID: \"2c3ae920-45c5-4b49-aed2-d651c4de9499\") " pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.581896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jsmw8\" (UniqueName: \"kubernetes.io/projected/2c3ae920-45c5-4b49-aed2-d651c4de9499-kube-api-access-jsmw8\") pod \"certified-operators-rsz6w\" (UID: \"2c3ae920-45c5-4b49-aed2-d651c4de9499\") " pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:51 crc kubenswrapper[4707]: I0218 05:51:51.750975 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.025635 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kptw6"] Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.026999 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.033065 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.037036 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kptw6"] Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.059534 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="065bc74d-6afe-4b4b-83a6-494643b467d7" path="/var/lib/kubelet/pods/065bc74d-6afe-4b4b-83a6-494643b467d7/volumes" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.060194 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ed8460-3a60-4ec0-b074-69244d0a46cf" path="/var/lib/kubelet/pods/37ed8460-3a60-4ec0-b074-69244d0a46cf/volumes" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.060762 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c734b6c-4efa-4755-b79c-2eb9d132ebcb" path="/var/lib/kubelet/pods/6c734b6c-4efa-4755-b79c-2eb9d132ebcb/volumes" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.061745 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f2ce0f9-b604-492c-b06e-c7b91ad94b6a" path="/var/lib/kubelet/pods/7f2ce0f9-b604-492c-b06e-c7b91ad94b6a/volumes" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.062416 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a7cf05-4af5-406e-8395-75b3634484e9" path="/var/lib/kubelet/pods/c2a7cf05-4af5-406e-8395-75b3634484e9/volumes" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.124132 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rsz6w"] Feb 18 05:51:52 crc kubenswrapper[4707]: W0218 05:51:52.128996 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3ae920_45c5_4b49_aed2_d651c4de9499.slice/crio-5f9ce2986fdffc7af73fa16340338ecbc55a9f3ef0fadd682c2ed48ce6037673 WatchSource:0}: Error finding container 5f9ce2986fdffc7af73fa16340338ecbc55a9f3ef0fadd682c2ed48ce6037673: Status 404 returned error can't find the container with id 5f9ce2986fdffc7af73fa16340338ecbc55a9f3ef0fadd682c2ed48ce6037673 Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.166689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c918c8de-d428-484d-910e-c513ed5db3b9-catalog-content\") pod \"redhat-marketplace-kptw6\" (UID: \"c918c8de-d428-484d-910e-c513ed5db3b9\") " pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.166747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2hmx\" (UniqueName: \"kubernetes.io/projected/c918c8de-d428-484d-910e-c513ed5db3b9-kube-api-access-l2hmx\") pod \"redhat-marketplace-kptw6\" (UID: \"c918c8de-d428-484d-910e-c513ed5db3b9\") " pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 
crc kubenswrapper[4707]: I0218 05:51:52.166917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c918c8de-d428-484d-910e-c513ed5db3b9-utilities\") pod \"redhat-marketplace-kptw6\" (UID: \"c918c8de-d428-484d-910e-c513ed5db3b9\") " pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.267646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c918c8de-d428-484d-910e-c513ed5db3b9-catalog-content\") pod \"redhat-marketplace-kptw6\" (UID: \"c918c8de-d428-484d-910e-c513ed5db3b9\") " pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.267719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2hmx\" (UniqueName: \"kubernetes.io/projected/c918c8de-d428-484d-910e-c513ed5db3b9-kube-api-access-l2hmx\") pod \"redhat-marketplace-kptw6\" (UID: \"c918c8de-d428-484d-910e-c513ed5db3b9\") " pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.267787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c918c8de-d428-484d-910e-c513ed5db3b9-utilities\") pod \"redhat-marketplace-kptw6\" (UID: \"c918c8de-d428-484d-910e-c513ed5db3b9\") " pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.268378 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c918c8de-d428-484d-910e-c513ed5db3b9-catalog-content\") pod \"redhat-marketplace-kptw6\" (UID: \"c918c8de-d428-484d-910e-c513ed5db3b9\") " pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: 
I0218 05:51:52.268540 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c918c8de-d428-484d-910e-c513ed5db3b9-utilities\") pod \"redhat-marketplace-kptw6\" (UID: \"c918c8de-d428-484d-910e-c513ed5db3b9\") " pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.284728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2hmx\" (UniqueName: \"kubernetes.io/projected/c918c8de-d428-484d-910e-c513ed5db3b9-kube-api-access-l2hmx\") pod \"redhat-marketplace-kptw6\" (UID: \"c918c8de-d428-484d-910e-c513ed5db3b9\") " pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.350122 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.414965 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c3ae920-45c5-4b49-aed2-d651c4de9499" containerID="e0a207a83e8e4f4080cb85da90d7ee4d11b87b1e3d82dad666f1363a71055c18" exitCode=0 Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.415024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsz6w" event={"ID":"2c3ae920-45c5-4b49-aed2-d651c4de9499","Type":"ContainerDied","Data":"e0a207a83e8e4f4080cb85da90d7ee4d11b87b1e3d82dad666f1363a71055c18"} Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.415487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsz6w" event={"ID":"2c3ae920-45c5-4b49-aed2-d651c4de9499","Type":"ContainerStarted","Data":"5f9ce2986fdffc7af73fa16340338ecbc55a9f3ef0fadd682c2ed48ce6037673"} Feb 18 05:51:52 crc kubenswrapper[4707]: I0218 05:51:52.706325 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kptw6"] 
Feb 18 05:51:52 crc kubenswrapper[4707]: W0218 05:51:52.712652 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc918c8de_d428_484d_910e_c513ed5db3b9.slice/crio-a1ec49bf327fe6ca812bb9cba7719bb7a49dd895bfadeb8c2cfc787c7c14ec8e WatchSource:0}: Error finding container a1ec49bf327fe6ca812bb9cba7719bb7a49dd895bfadeb8c2cfc787c7c14ec8e: Status 404 returned error can't find the container with id a1ec49bf327fe6ca812bb9cba7719bb7a49dd895bfadeb8c2cfc787c7c14ec8e Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.430650 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c3ae920-45c5-4b49-aed2-d651c4de9499" containerID="4ddd0c89d72b402c47043478446ca1aa2d4a0a65ce341c85538a62fd7fd5d522" exitCode=0 Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.431040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsz6w" event={"ID":"2c3ae920-45c5-4b49-aed2-d651c4de9499","Type":"ContainerDied","Data":"4ddd0c89d72b402c47043478446ca1aa2d4a0a65ce341c85538a62fd7fd5d522"} Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.433174 4707 generic.go:334] "Generic (PLEG): container finished" podID="c918c8de-d428-484d-910e-c513ed5db3b9" containerID="ea92edfc2f5605753e6b10d6f9aafadd686804d1f0053c5017bcb94591e29b93" exitCode=0 Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.433229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kptw6" event={"ID":"c918c8de-d428-484d-910e-c513ed5db3b9","Type":"ContainerDied","Data":"ea92edfc2f5605753e6b10d6f9aafadd686804d1f0053c5017bcb94591e29b93"} Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.433298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kptw6" 
event={"ID":"c918c8de-d428-484d-910e-c513ed5db3b9","Type":"ContainerStarted","Data":"a1ec49bf327fe6ca812bb9cba7719bb7a49dd895bfadeb8c2cfc787c7c14ec8e"} Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.831154 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zprxl"] Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.834184 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.841714 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zprxl"] Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.841850 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.886329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rspps\" (UniqueName: \"kubernetes.io/projected/9a253b57-9570-4ed9-8d7a-acb1733f9db2-kube-api-access-rspps\") pod \"redhat-operators-zprxl\" (UID: \"9a253b57-9570-4ed9-8d7a-acb1733f9db2\") " pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.886619 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a253b57-9570-4ed9-8d7a-acb1733f9db2-utilities\") pod \"redhat-operators-zprxl\" (UID: \"9a253b57-9570-4ed9-8d7a-acb1733f9db2\") " pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.886647 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a253b57-9570-4ed9-8d7a-acb1733f9db2-catalog-content\") pod \"redhat-operators-zprxl\" 
(UID: \"9a253b57-9570-4ed9-8d7a-acb1733f9db2\") " pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.987163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rspps\" (UniqueName: \"kubernetes.io/projected/9a253b57-9570-4ed9-8d7a-acb1733f9db2-kube-api-access-rspps\") pod \"redhat-operators-zprxl\" (UID: \"9a253b57-9570-4ed9-8d7a-acb1733f9db2\") " pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.987205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a253b57-9570-4ed9-8d7a-acb1733f9db2-utilities\") pod \"redhat-operators-zprxl\" (UID: \"9a253b57-9570-4ed9-8d7a-acb1733f9db2\") " pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.987236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a253b57-9570-4ed9-8d7a-acb1733f9db2-catalog-content\") pod \"redhat-operators-zprxl\" (UID: \"9a253b57-9570-4ed9-8d7a-acb1733f9db2\") " pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.987661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a253b57-9570-4ed9-8d7a-acb1733f9db2-catalog-content\") pod \"redhat-operators-zprxl\" (UID: \"9a253b57-9570-4ed9-8d7a-acb1733f9db2\") " pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:53 crc kubenswrapper[4707]: I0218 05:51:53.987693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a253b57-9570-4ed9-8d7a-acb1733f9db2-utilities\") pod \"redhat-operators-zprxl\" (UID: \"9a253b57-9570-4ed9-8d7a-acb1733f9db2\") " 
pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.008860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rspps\" (UniqueName: \"kubernetes.io/projected/9a253b57-9570-4ed9-8d7a-acb1733f9db2-kube-api-access-rspps\") pod \"redhat-operators-zprxl\" (UID: \"9a253b57-9570-4ed9-8d7a-acb1733f9db2\") " pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.020390 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-nnzsq"] Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.020582 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" podUID="52a0b51e-c0d5-4536-9537-b0cee22b7a48" containerName="controller-manager" containerID="cri-o://697ec5b0c8c989bf224b9984a670324789b4dc1d04d7b6d27cdf4a5436b89321" gracePeriod=30 Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.161754 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.428462 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b26jp"] Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.430879 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.434394 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.442852 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b26jp"] Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.445446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsz6w" event={"ID":"2c3ae920-45c5-4b49-aed2-d651c4de9499","Type":"ContainerStarted","Data":"c792ed2a1b5666728befec8c72e558e7fe41a70ccc37e2d7c786169a333fe7f6"} Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.450305 4707 generic.go:334] "Generic (PLEG): container finished" podID="c918c8de-d428-484d-910e-c513ed5db3b9" containerID="de018c05d43ce07b103b1fa0dacbf1e5ee41b5c9edd51c1c66ec89f0ebd369e4" exitCode=0 Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.450358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kptw6" event={"ID":"c918c8de-d428-484d-910e-c513ed5db3b9","Type":"ContainerDied","Data":"de018c05d43ce07b103b1fa0dacbf1e5ee41b5c9edd51c1c66ec89f0ebd369e4"} Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.454474 4707 generic.go:334] "Generic (PLEG): container finished" podID="52a0b51e-c0d5-4536-9537-b0cee22b7a48" containerID="697ec5b0c8c989bf224b9984a670324789b4dc1d04d7b6d27cdf4a5436b89321" exitCode=0 Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.454504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" event={"ID":"52a0b51e-c0d5-4536-9537-b0cee22b7a48","Type":"ContainerDied","Data":"697ec5b0c8c989bf224b9984a670324789b4dc1d04d7b6d27cdf4a5436b89321"} Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 
05:51:54.462618 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rsz6w" podStartSLOduration=1.832355379 podStartE2EDuration="3.462599478s" podCreationTimestamp="2026-02-18 05:51:51 +0000 UTC" firstStartedPulling="2026-02-18 05:51:52.417266143 +0000 UTC m=+249.065225277" lastFinishedPulling="2026-02-18 05:51:54.047510242 +0000 UTC m=+250.695469376" observedRunningTime="2026-02-18 05:51:54.461884118 +0000 UTC m=+251.109843252" watchObservedRunningTime="2026-02-18 05:51:54.462599478 +0000 UTC m=+251.110558612" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.477134 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.546327 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zprxl"] Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.594344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52a0b51e-c0d5-4536-9537-b0cee22b7a48-serving-cert\") pod \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.594442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9qt8\" (UniqueName: \"kubernetes.io/projected/52a0b51e-c0d5-4536-9537-b0cee22b7a48-kube-api-access-s9qt8\") pod \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.594499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-client-ca\") pod \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\" (UID: 
\"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.594525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-proxy-ca-bundles\") pod \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.594552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-config\") pod \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\" (UID: \"52a0b51e-c0d5-4536-9537-b0cee22b7a48\") " Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.594721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-catalog-content\") pod \"community-operators-b26jp\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.594780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9w9g\" (UniqueName: \"kubernetes.io/projected/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-kube-api-access-d9w9g\") pod \"community-operators-b26jp\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.595317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "52a0b51e-c0d5-4536-9537-b0cee22b7a48" (UID: "52a0b51e-c0d5-4536-9537-b0cee22b7a48"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.595441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-client-ca" (OuterVolumeSpecName: "client-ca") pod "52a0b51e-c0d5-4536-9537-b0cee22b7a48" (UID: "52a0b51e-c0d5-4536-9537-b0cee22b7a48"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.595568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-utilities\") pod \"community-operators-b26jp\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.595720 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.596212 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.596267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-config" (OuterVolumeSpecName: "config") pod "52a0b51e-c0d5-4536-9537-b0cee22b7a48" (UID: "52a0b51e-c0d5-4536-9537-b0cee22b7a48"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.599059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a0b51e-c0d5-4536-9537-b0cee22b7a48-kube-api-access-s9qt8" (OuterVolumeSpecName: "kube-api-access-s9qt8") pod "52a0b51e-c0d5-4536-9537-b0cee22b7a48" (UID: "52a0b51e-c0d5-4536-9537-b0cee22b7a48"). InnerVolumeSpecName "kube-api-access-s9qt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.600684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52a0b51e-c0d5-4536-9537-b0cee22b7a48-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "52a0b51e-c0d5-4536-9537-b0cee22b7a48" (UID: "52a0b51e-c0d5-4536-9537-b0cee22b7a48"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.697323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-catalog-content\") pod \"community-operators-b26jp\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.697391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9w9g\" (UniqueName: \"kubernetes.io/projected/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-kube-api-access-d9w9g\") pod \"community-operators-b26jp\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.697453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-utilities\") pod \"community-operators-b26jp\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.697494 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52a0b51e-c0d5-4536-9537-b0cee22b7a48-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.697505 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52a0b51e-c0d5-4536-9537-b0cee22b7a48-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.697516 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9qt8\" (UniqueName: \"kubernetes.io/projected/52a0b51e-c0d5-4536-9537-b0cee22b7a48-kube-api-access-s9qt8\") on node \"crc\" DevicePath \"\"" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.697783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-catalog-content\") pod \"community-operators-b26jp\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.697865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-utilities\") pod \"community-operators-b26jp\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.713091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9w9g\" (UniqueName: 
\"kubernetes.io/projected/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-kube-api-access-d9w9g\") pod \"community-operators-b26jp\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:54 crc kubenswrapper[4707]: I0218 05:51:54.753616 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.168551 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b26jp"] Feb 18 05:51:55 crc kubenswrapper[4707]: W0218 05:51:55.176462 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6f6f77_5e62_4186_bea4_19b13aa2a79a.slice/crio-45e3c61e1a9c18a32f5e29efce9d8cde4879ffb331e6076ddca8e5771642963c WatchSource:0}: Error finding container 45e3c61e1a9c18a32f5e29efce9d8cde4879ffb331e6076ddca8e5771642963c: Status 404 returned error can't find the container with id 45e3c61e1a9c18a32f5e29efce9d8cde4879ffb331e6076ddca8e5771642963c Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.245378 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-899f8988f-zlvlx"] Feb 18 05:51:55 crc kubenswrapper[4707]: E0218 05:51:55.245621 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a0b51e-c0d5-4536-9537-b0cee22b7a48" containerName="controller-manager" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.245641 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a0b51e-c0d5-4536-9537-b0cee22b7a48" containerName="controller-manager" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.245772 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a0b51e-c0d5-4536-9537-b0cee22b7a48" containerName="controller-manager" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.246246 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.261308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-899f8988f-zlvlx"] Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.407882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-config\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.407960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ckzq\" (UniqueName: \"kubernetes.io/projected/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-kube-api-access-2ckzq\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.408013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-proxy-ca-bundles\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.408074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-serving-cert\") pod \"controller-manager-899f8988f-zlvlx\" (UID: 
\"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.408097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-client-ca\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.460838 4707 generic.go:334] "Generic (PLEG): container finished" podID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerID="b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee" exitCode=0 Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.460943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b26jp" event={"ID":"0f6f6f77-5e62-4186-bea4-19b13aa2a79a","Type":"ContainerDied","Data":"b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee"} Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.461008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b26jp" event={"ID":"0f6f6f77-5e62-4186-bea4-19b13aa2a79a","Type":"ContainerStarted","Data":"45e3c61e1a9c18a32f5e29efce9d8cde4879ffb331e6076ddca8e5771642963c"} Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.463303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kptw6" event={"ID":"c918c8de-d428-484d-910e-c513ed5db3b9","Type":"ContainerStarted","Data":"b1697990eb421eed3de76a6b2235d75adc3cd5a4032f5ff63623f4b673bb7853"} Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.464509 4707 generic.go:334] "Generic (PLEG): container finished" podID="9a253b57-9570-4ed9-8d7a-acb1733f9db2" 
containerID="6f58741cf0c2f457969321539a1d88273db465d5c00061295815521c1d751626" exitCode=0 Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.464569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zprxl" event={"ID":"9a253b57-9570-4ed9-8d7a-acb1733f9db2","Type":"ContainerDied","Data":"6f58741cf0c2f457969321539a1d88273db465d5c00061295815521c1d751626"} Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.464592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zprxl" event={"ID":"9a253b57-9570-4ed9-8d7a-acb1733f9db2","Type":"ContainerStarted","Data":"d97a351deb23b296784f4aee6c248db6d902008ef1d9eda3591f06c204b6724e"} Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.468104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" event={"ID":"52a0b51e-c0d5-4536-9537-b0cee22b7a48","Type":"ContainerDied","Data":"ef0304b1bdcb63cc02ce8c962faac92f8019db5b977855e022d2c74ea65f6b16"} Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.468149 4707 scope.go:117] "RemoveContainer" containerID="697ec5b0c8c989bf224b9984a670324789b4dc1d04d7b6d27cdf4a5436b89321" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.468941 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6799c5f44c-nnzsq" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.505808 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kptw6" podStartSLOduration=2.092754742 podStartE2EDuration="3.5057775s" podCreationTimestamp="2026-02-18 05:51:52 +0000 UTC" firstStartedPulling="2026-02-18 05:51:53.434318824 +0000 UTC m=+250.082277968" lastFinishedPulling="2026-02-18 05:51:54.847341592 +0000 UTC m=+251.495300726" observedRunningTime="2026-02-18 05:51:55.503293239 +0000 UTC m=+252.151252373" watchObservedRunningTime="2026-02-18 05:51:55.5057775 +0000 UTC m=+252.153736634" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.508965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-proxy-ca-bundles\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.509011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-serving-cert\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.509036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-client-ca\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc 
kubenswrapper[4707]: I0218 05:51:55.509119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-config\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.509152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ckzq\" (UniqueName: \"kubernetes.io/projected/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-kube-api-access-2ckzq\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.510071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-proxy-ca-bundles\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.510111 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-client-ca\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.510455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-config\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " 
pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.516736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-serving-cert\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.530858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ckzq\" (UniqueName: \"kubernetes.io/projected/bd66f8b8-85ae-4b37-9852-5733ee2f87e2-kube-api-access-2ckzq\") pod \"controller-manager-899f8988f-zlvlx\" (UID: \"bd66f8b8-85ae-4b37-9852-5733ee2f87e2\") " pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.543965 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-nnzsq"] Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.546658 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-nnzsq"] Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.586707 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:55 crc kubenswrapper[4707]: I0218 05:51:55.761774 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-899f8988f-zlvlx"] Feb 18 05:51:55 crc kubenswrapper[4707]: W0218 05:51:55.765505 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd66f8b8_85ae_4b37_9852_5733ee2f87e2.slice/crio-10153dae436d323a00d89066c87b34e0736ef1d876ab74aa6c3783be9e7ae4cb WatchSource:0}: Error finding container 10153dae436d323a00d89066c87b34e0736ef1d876ab74aa6c3783be9e7ae4cb: Status 404 returned error can't find the container with id 10153dae436d323a00d89066c87b34e0736ef1d876ab74aa6c3783be9e7ae4cb Feb 18 05:51:56 crc kubenswrapper[4707]: I0218 05:51:56.078036 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a0b51e-c0d5-4536-9537-b0cee22b7a48" path="/var/lib/kubelet/pods/52a0b51e-c0d5-4536-9537-b0cee22b7a48/volumes" Feb 18 05:51:56 crc kubenswrapper[4707]: I0218 05:51:56.475049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zprxl" event={"ID":"9a253b57-9570-4ed9-8d7a-acb1733f9db2","Type":"ContainerStarted","Data":"638cbd9919fd1108c41c70a6e3560fddcfd713e8f401c17cd74309302c9d90c3"} Feb 18 05:51:56 crc kubenswrapper[4707]: I0218 05:51:56.478627 4707 generic.go:334] "Generic (PLEG): container finished" podID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerID="8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc" exitCode=0 Feb 18 05:51:56 crc kubenswrapper[4707]: I0218 05:51:56.478703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b26jp" event={"ID":"0f6f6f77-5e62-4186-bea4-19b13aa2a79a","Type":"ContainerDied","Data":"8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc"} Feb 18 05:51:56 crc 
kubenswrapper[4707]: I0218 05:51:56.484177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" event={"ID":"bd66f8b8-85ae-4b37-9852-5733ee2f87e2","Type":"ContainerStarted","Data":"365133d1cf46db60867c9ce2ca8032884f483eaab6282548e6a7ce261e1dfcd1"} Feb 18 05:51:56 crc kubenswrapper[4707]: I0218 05:51:56.484246 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:56 crc kubenswrapper[4707]: I0218 05:51:56.484262 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" event={"ID":"bd66f8b8-85ae-4b37-9852-5733ee2f87e2","Type":"ContainerStarted","Data":"10153dae436d323a00d89066c87b34e0736ef1d876ab74aa6c3783be9e7ae4cb"} Feb 18 05:51:56 crc kubenswrapper[4707]: I0218 05:51:56.488266 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" Feb 18 05:51:56 crc kubenswrapper[4707]: I0218 05:51:56.521646 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-899f8988f-zlvlx" podStartSLOduration=2.5216175769999998 podStartE2EDuration="2.521617577s" podCreationTimestamp="2026-02-18 05:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:51:56.51930266 +0000 UTC m=+253.167261814" watchObservedRunningTime="2026-02-18 05:51:56.521617577 +0000 UTC m=+253.169576711" Feb 18 05:51:57 crc kubenswrapper[4707]: I0218 05:51:57.490036 4707 generic.go:334] "Generic (PLEG): container finished" podID="9a253b57-9570-4ed9-8d7a-acb1733f9db2" containerID="638cbd9919fd1108c41c70a6e3560fddcfd713e8f401c17cd74309302c9d90c3" exitCode=0 Feb 18 05:51:57 crc kubenswrapper[4707]: I0218 05:51:57.490119 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zprxl" event={"ID":"9a253b57-9570-4ed9-8d7a-acb1733f9db2","Type":"ContainerDied","Data":"638cbd9919fd1108c41c70a6e3560fddcfd713e8f401c17cd74309302c9d90c3"} Feb 18 05:51:57 crc kubenswrapper[4707]: I0218 05:51:57.499308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b26jp" event={"ID":"0f6f6f77-5e62-4186-bea4-19b13aa2a79a","Type":"ContainerStarted","Data":"f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b"} Feb 18 05:51:57 crc kubenswrapper[4707]: I0218 05:51:57.528561 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b26jp" podStartSLOduration=2.089390238 podStartE2EDuration="3.528539267s" podCreationTimestamp="2026-02-18 05:51:54 +0000 UTC" firstStartedPulling="2026-02-18 05:51:55.462017593 +0000 UTC m=+252.109976727" lastFinishedPulling="2026-02-18 05:51:56.901166622 +0000 UTC m=+253.549125756" observedRunningTime="2026-02-18 05:51:57.526625353 +0000 UTC m=+254.174584487" watchObservedRunningTime="2026-02-18 05:51:57.528539267 +0000 UTC m=+254.176498391" Feb 18 05:51:58 crc kubenswrapper[4707]: I0218 05:51:58.507360 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zprxl" event={"ID":"9a253b57-9570-4ed9-8d7a-acb1733f9db2","Type":"ContainerStarted","Data":"326ea30d6c47e0eaf667ded7d482e9b4e07058919efa4032ee03a92fab1d3487"} Feb 18 05:52:01 crc kubenswrapper[4707]: I0218 05:52:01.751263 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:52:01 crc kubenswrapper[4707]: I0218 05:52:01.752006 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:52:01 crc kubenswrapper[4707]: I0218 05:52:01.797811 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:52:01 crc kubenswrapper[4707]: I0218 05:52:01.820673 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zprxl" podStartSLOduration=6.401887452 podStartE2EDuration="8.820655887s" podCreationTimestamp="2026-02-18 05:51:53 +0000 UTC" firstStartedPulling="2026-02-18 05:51:55.467180311 +0000 UTC m=+252.115139445" lastFinishedPulling="2026-02-18 05:51:57.885948746 +0000 UTC m=+254.533907880" observedRunningTime="2026-02-18 05:51:58.527490169 +0000 UTC m=+255.175449323" watchObservedRunningTime="2026-02-18 05:52:01.820655887 +0000 UTC m=+258.468615031" Feb 18 05:52:02 crc kubenswrapper[4707]: I0218 05:52:02.351291 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:52:02 crc kubenswrapper[4707]: I0218 05:52:02.351643 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:52:02 crc kubenswrapper[4707]: I0218 05:52:02.389552 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:52:02 crc kubenswrapper[4707]: I0218 05:52:02.582816 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rsz6w" Feb 18 05:52:02 crc kubenswrapper[4707]: I0218 05:52:02.583243 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kptw6" Feb 18 05:52:04 crc kubenswrapper[4707]: I0218 05:52:04.162493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:52:04 crc kubenswrapper[4707]: I0218 05:52:04.162929 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:52:04 crc kubenswrapper[4707]: I0218 05:52:04.206930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:52:04 crc kubenswrapper[4707]: I0218 05:52:04.590473 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zprxl" Feb 18 05:52:04 crc kubenswrapper[4707]: I0218 05:52:04.754660 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:52:04 crc kubenswrapper[4707]: I0218 05:52:04.754747 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:52:04 crc kubenswrapper[4707]: I0218 05:52:04.791549 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:52:05 crc kubenswrapper[4707]: I0218 05:52:05.597228 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b26jp" Feb 18 05:52:07 crc kubenswrapper[4707]: I0218 05:52:07.851165 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" podUID="d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" containerName="registry" containerID="cri-o://ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148" gracePeriod=30 Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.309891 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.405611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-tls\") pod \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.405713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-bound-sa-token\") pod \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.405774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-ca-trust-extracted\") pod \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.405878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-certificates\") pod \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.405906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-trusted-ca\") pod \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.406115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.406143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mln25\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-kube-api-access-mln25\") pod \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.406179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-installation-pull-secrets\") pod \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\" (UID: \"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d\") " Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.406841 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.406882 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.413337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.413890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.414105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.421007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-kube-api-access-mln25" (OuterVolumeSpecName: "kube-api-access-mln25") pod "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d"). InnerVolumeSpecName "kube-api-access-mln25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.438410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.439947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" (UID: "d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.507748 4707 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.507787 4707 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.507815 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.507824 4707 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.507832 4707 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.507842 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.507851 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mln25\" (UniqueName: \"kubernetes.io/projected/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d-kube-api-access-mln25\") on node \"crc\" DevicePath \"\"" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.590260 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" containerID="ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148" exitCode=0 Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.590324 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.590330 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" event={"ID":"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d","Type":"ContainerDied","Data":"ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148"} Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.590455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r74s4" event={"ID":"d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d","Type":"ContainerDied","Data":"47e499d55fc1e344e023fdfd1c4cfe13fd68d5355d82f02ae80ddc871179d799"} Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.590478 4707 scope.go:117] "RemoveContainer" containerID="ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.606857 4707 scope.go:117] "RemoveContainer" containerID="ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148" Feb 18 05:52:10 crc kubenswrapper[4707]: E0218 05:52:10.607389 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148\": container with ID starting with ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148 not found: ID does not exist" containerID="ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.607454 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148"} err="failed to get container status \"ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148\": rpc error: code = NotFound desc = could not find container 
\"ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148\": container with ID starting with ce5418afa2e470dce4fd45aeecbeb232b87b1c4d938f49b992c7f6543caa0148 not found: ID does not exist" Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.633167 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r74s4"] Feb 18 05:52:10 crc kubenswrapper[4707]: I0218 05:52:10.638108 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r74s4"] Feb 18 05:52:12 crc kubenswrapper[4707]: I0218 05:52:12.059659 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" path="/var/lib/kubelet/pods/d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d/volumes" Feb 18 05:52:43 crc kubenswrapper[4707]: I0218 05:52:43.815613 4707 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 18 05:53:21 crc kubenswrapper[4707]: I0218 05:53:21.382449 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:53:21 crc kubenswrapper[4707]: I0218 05:53:21.383161 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:53:51 crc kubenswrapper[4707]: I0218 05:53:51.382524 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:53:51 crc kubenswrapper[4707]: I0218 05:53:51.383077 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:54:21 crc kubenswrapper[4707]: I0218 05:54:21.382779 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:54:21 crc kubenswrapper[4707]: I0218 05:54:21.383429 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:54:21 crc kubenswrapper[4707]: I0218 05:54:21.383504 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 05:54:21 crc kubenswrapper[4707]: I0218 05:54:21.384564 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c30aded239e5b6a5a3a43a43d3d8062408ccb46a9109591c1b4a41345a3dce40"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 05:54:21 crc kubenswrapper[4707]: I0218 05:54:21.384676 4707 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://c30aded239e5b6a5a3a43a43d3d8062408ccb46a9109591c1b4a41345a3dce40" gracePeriod=600 Feb 18 05:54:22 crc kubenswrapper[4707]: I0218 05:54:22.333878 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="c30aded239e5b6a5a3a43a43d3d8062408ccb46a9109591c1b4a41345a3dce40" exitCode=0 Feb 18 05:54:22 crc kubenswrapper[4707]: I0218 05:54:22.334010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"c30aded239e5b6a5a3a43a43d3d8062408ccb46a9109591c1b4a41345a3dce40"} Feb 18 05:54:22 crc kubenswrapper[4707]: I0218 05:54:22.334240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"d49d6adeee4f1d9b81111c98288055c114a1e0649058be818dd7f0b90f16510a"} Feb 18 05:54:22 crc kubenswrapper[4707]: I0218 05:54:22.334267 4707 scope.go:117] "RemoveContainer" containerID="bb616e6ef6d668e0c27124102abb5c64f761976e7550e81d8eb8d94a07fb5fd4" Feb 18 05:56:21 crc kubenswrapper[4707]: I0218 05:56:21.382562 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:56:21 crc kubenswrapper[4707]: I0218 05:56:21.383149 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.720375 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts"] Feb 18 05:56:23 crc kubenswrapper[4707]: E0218 05:56:23.720627 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" containerName="registry" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.720642 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" containerName="registry" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.720723 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d5877c-6bdc-4837-a2e9-1f1a751e4e2d" containerName="registry" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.721107 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.722762 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vtwjs" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.726017 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.731386 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.749003 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fvczt"] Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.752240 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fvczt" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.759062 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pnq9x" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.759917 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts"] Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.766840 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dgt7d"] Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.767532 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.770216 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-xcwvw" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.773987 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fvczt"] Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.780640 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dgt7d"] Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.794855 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c84xd\" (UniqueName: \"kubernetes.io/projected/e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f-kube-api-access-c84xd\") pod \"cert-manager-cainjector-cf98fcc89-hr4ts\" (UID: \"e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.896495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c84xd\" (UniqueName: 
\"kubernetes.io/projected/e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f-kube-api-access-c84xd\") pod \"cert-manager-cainjector-cf98fcc89-hr4ts\" (UID: \"e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.896558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fghd8\" (UniqueName: \"kubernetes.io/projected/ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0-kube-api-access-fghd8\") pod \"cert-manager-858654f9db-fvczt\" (UID: \"ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0\") " pod="cert-manager/cert-manager-858654f9db-fvczt" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.896610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm82g\" (UniqueName: \"kubernetes.io/projected/07e55359-ea37-4736-a571-315823908633-kube-api-access-fm82g\") pod \"cert-manager-webhook-687f57d79b-dgt7d\" (UID: \"07e55359-ea37-4736-a571-315823908633\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.916156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c84xd\" (UniqueName: \"kubernetes.io/projected/e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f-kube-api-access-c84xd\") pod \"cert-manager-cainjector-cf98fcc89-hr4ts\" (UID: \"e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts" Feb 18 05:56:23 crc kubenswrapper[4707]: I0218 05:56:23.998189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm82g\" (UniqueName: \"kubernetes.io/projected/07e55359-ea37-4736-a571-315823908633-kube-api-access-fm82g\") pod \"cert-manager-webhook-687f57d79b-dgt7d\" (UID: \"07e55359-ea37-4736-a571-315823908633\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" Feb 18 05:56:23 crc kubenswrapper[4707]: 
I0218 05:56:23.998559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fghd8\" (UniqueName: \"kubernetes.io/projected/ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0-kube-api-access-fghd8\") pod \"cert-manager-858654f9db-fvczt\" (UID: \"ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0\") " pod="cert-manager/cert-manager-858654f9db-fvczt" Feb 18 05:56:24 crc kubenswrapper[4707]: I0218 05:56:24.014734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm82g\" (UniqueName: \"kubernetes.io/projected/07e55359-ea37-4736-a571-315823908633-kube-api-access-fm82g\") pod \"cert-manager-webhook-687f57d79b-dgt7d\" (UID: \"07e55359-ea37-4736-a571-315823908633\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" Feb 18 05:56:24 crc kubenswrapper[4707]: I0218 05:56:24.014832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fghd8\" (UniqueName: \"kubernetes.io/projected/ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0-kube-api-access-fghd8\") pod \"cert-manager-858654f9db-fvczt\" (UID: \"ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0\") " pod="cert-manager/cert-manager-858654f9db-fvczt" Feb 18 05:56:24 crc kubenswrapper[4707]: I0218 05:56:24.051271 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts" Feb 18 05:56:24 crc kubenswrapper[4707]: I0218 05:56:24.078056 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fvczt" Feb 18 05:56:24 crc kubenswrapper[4707]: I0218 05:56:24.086926 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" Feb 18 05:56:24 crc kubenswrapper[4707]: I0218 05:56:24.501468 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dgt7d"] Feb 18 05:56:24 crc kubenswrapper[4707]: I0218 05:56:24.509284 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts"] Feb 18 05:56:24 crc kubenswrapper[4707]: I0218 05:56:24.512469 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 05:56:24 crc kubenswrapper[4707]: I0218 05:56:24.549920 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fvczt"] Feb 18 05:56:24 crc kubenswrapper[4707]: W0218 05:56:24.553209 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca9b8372_6acf_4c51_8eaf_a0f0195ed4e0.slice/crio-384ee1c3bfee069a13d393131f957a685028a350b332adf855787d0ea13d4df0 WatchSource:0}: Error finding container 384ee1c3bfee069a13d393131f957a685028a350b332adf855787d0ea13d4df0: Status 404 returned error can't find the container with id 384ee1c3bfee069a13d393131f957a685028a350b332adf855787d0ea13d4df0 Feb 18 05:56:25 crc kubenswrapper[4707]: I0218 05:56:25.015906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts" event={"ID":"e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f","Type":"ContainerStarted","Data":"6d01ee2b01dc73b93be73e71526a801d045f350069be99b3377accddb1ab02f0"} Feb 18 05:56:25 crc kubenswrapper[4707]: I0218 05:56:25.017400 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" event={"ID":"07e55359-ea37-4736-a571-315823908633","Type":"ContainerStarted","Data":"4b08018aea83abe48d60a42d1ecf1ea04ba75b41bd73472fe5aff90c8a6dacb0"} Feb 18 05:56:25 crc 
kubenswrapper[4707]: I0218 05:56:25.018331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fvczt" event={"ID":"ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0","Type":"ContainerStarted","Data":"384ee1c3bfee069a13d393131f957a685028a350b332adf855787d0ea13d4df0"} Feb 18 05:56:30 crc kubenswrapper[4707]: I0218 05:56:30.046111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts" event={"ID":"e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f","Type":"ContainerStarted","Data":"568c404545ff65381410b616e7c3453bb7ee2f034007c83e0aec9cdeb3fb3bbc"} Feb 18 05:56:30 crc kubenswrapper[4707]: I0218 05:56:30.047572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" event={"ID":"07e55359-ea37-4736-a571-315823908633","Type":"ContainerStarted","Data":"a998acf2f865b8c762b1cd00720b01a8819ddd3981800a48f7673115001c8479"} Feb 18 05:56:30 crc kubenswrapper[4707]: I0218 05:56:30.047701 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" Feb 18 05:56:30 crc kubenswrapper[4707]: I0218 05:56:30.048914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fvczt" event={"ID":"ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0","Type":"ContainerStarted","Data":"3e1c9ba624bafeca90d573214d2de47ab74a3333caf3ed790a89144423bc56e7"} Feb 18 05:56:30 crc kubenswrapper[4707]: I0218 05:56:30.071047 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hr4ts" podStartSLOduration=3.059456512 podStartE2EDuration="7.071023851s" podCreationTimestamp="2026-02-18 05:56:23 +0000 UTC" firstStartedPulling="2026-02-18 05:56:24.51228392 +0000 UTC m=+521.160243054" lastFinishedPulling="2026-02-18 05:56:28.523851259 +0000 UTC m=+525.171810393" observedRunningTime="2026-02-18 05:56:30.064071772 +0000 UTC 
m=+526.712030946" watchObservedRunningTime="2026-02-18 05:56:30.071023851 +0000 UTC m=+526.718983025" Feb 18 05:56:30 crc kubenswrapper[4707]: I0218 05:56:30.081865 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fvczt" podStartSLOduration=2.480520484 podStartE2EDuration="7.081844565s" podCreationTimestamp="2026-02-18 05:56:23 +0000 UTC" firstStartedPulling="2026-02-18 05:56:24.555083907 +0000 UTC m=+521.203043041" lastFinishedPulling="2026-02-18 05:56:29.156407978 +0000 UTC m=+525.804367122" observedRunningTime="2026-02-18 05:56:30.077770764 +0000 UTC m=+526.725729938" watchObservedRunningTime="2026-02-18 05:56:30.081844565 +0000 UTC m=+526.729803739" Feb 18 05:56:30 crc kubenswrapper[4707]: I0218 05:56:30.095495 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" podStartSLOduration=2.625302355 podStartE2EDuration="7.095475026s" podCreationTimestamp="2026-02-18 05:56:23 +0000 UTC" firstStartedPulling="2026-02-18 05:56:24.512317371 +0000 UTC m=+521.160276505" lastFinishedPulling="2026-02-18 05:56:28.982490042 +0000 UTC m=+525.630449176" observedRunningTime="2026-02-18 05:56:30.092904386 +0000 UTC m=+526.740863540" watchObservedRunningTime="2026-02-18 05:56:30.095475026 +0000 UTC m=+526.743434160" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.090274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-dgt7d" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.105141 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r5qsf"] Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.105966 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovn-controller" 
containerID="cri-o://ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c" gracePeriod=30 Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.105996 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="northd" containerID="cri-o://5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2" gracePeriod=30 Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.106026 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="kube-rbac-proxy-node" containerID="cri-o://6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c" gracePeriod=30 Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.106008 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="sbdb" containerID="cri-o://eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9" gracePeriod=30 Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.106126 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358" gracePeriod=30 Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.106114 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="nbdb" containerID="cri-o://b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06" gracePeriod=30 Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.106184 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovn-acl-logging" containerID="cri-o://f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a" gracePeriod=30 Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.143485 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovnkube-controller" containerID="cri-o://efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3" gracePeriod=30 Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.847252 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r5qsf_2c00624a-9b7d-4593-821c-c76976b1c192/ovn-acl-logging/0.log" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.847925 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r5qsf_2c00624a-9b7d-4593-821c-c76976b1c192/ovn-controller/0.log" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.848285 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904008 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6v7n9"] Feb 18 05:56:34 crc kubenswrapper[4707]: E0218 05:56:34.904257 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovnkube-controller" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904285 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovnkube-controller" Feb 18 05:56:34 crc kubenswrapper[4707]: E0218 05:56:34.904298 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="nbdb" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904306 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="nbdb" Feb 18 05:56:34 crc kubenswrapper[4707]: E0218 05:56:34.904318 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904328 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 05:56:34 crc kubenswrapper[4707]: E0218 05:56:34.904344 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="northd" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904351 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="northd" Feb 18 05:56:34 crc kubenswrapper[4707]: E0218 05:56:34.904363 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" 
containerName="kubecfg-setup" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904370 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="kubecfg-setup" Feb 18 05:56:34 crc kubenswrapper[4707]: E0218 05:56:34.904379 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="sbdb" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904386 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="sbdb" Feb 18 05:56:34 crc kubenswrapper[4707]: E0218 05:56:34.904395 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovn-acl-logging" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904403 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovn-acl-logging" Feb 18 05:56:34 crc kubenswrapper[4707]: E0218 05:56:34.904412 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovn-controller" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904419 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovn-controller" Feb 18 05:56:34 crc kubenswrapper[4707]: E0218 05:56:34.904430 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="kube-rbac-proxy-node" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904437 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="kube-rbac-proxy-node" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904554 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovnkube-controller" 
Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904567 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovn-controller" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904576 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="northd" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904588 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="ovn-acl-logging" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904602 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904613 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="nbdb" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904623 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="sbdb" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.904632 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" containerName="kube-rbac-proxy-node" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.906497 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-node-log\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c00624a-9b7d-4593-821c-c76976b1c192-ovn-node-metrics-cert\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988302 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4p6s\" (UniqueName: \"kubernetes.io/projected/2c00624a-9b7d-4593-821c-c76976b1c192-kube-api-access-z4p6s\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-env-overrides\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-kubelet\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-log-socket\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988393 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-ovn-kubernetes\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-ovn\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-openvswitch\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-script-lib\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: 
\"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-systemd\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-systemd-units\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-netd\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-netns\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-config\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-etc-openvswitch\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-bin\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-slash\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-var-lib-openvswitch\") pod \"2c00624a-9b7d-4593-821c-c76976b1c192\" (UID: \"2c00624a-9b7d-4593-821c-c76976b1c192\") " Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.988896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-node-log" (OuterVolumeSpecName: "node-log") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-log-socket" (OuterVolumeSpecName: "log-socket") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989124 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989225 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-slash" (OuterVolumeSpecName: "host-slash") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.989602 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.993567 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c00624a-9b7d-4593-821c-c76976b1c192-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 05:56:34 crc kubenswrapper[4707]: I0218 05:56:34.994961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c00624a-9b7d-4593-821c-c76976b1c192-kube-api-access-z4p6s" (OuterVolumeSpecName: "kube-api-access-z4p6s") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "kube-api-access-z4p6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.002939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2c00624a-9b7d-4593-821c-c76976b1c192" (UID: "2c00624a-9b7d-4593-821c-c76976b1c192"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.083394 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r5qsf_2c00624a-9b7d-4593-821c-c76976b1c192/ovn-acl-logging/0.log" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084152 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r5qsf_2c00624a-9b7d-4593-821c-c76976b1c192/ovn-controller/0.log" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084694 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c00624a-9b7d-4593-821c-c76976b1c192" containerID="efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3" exitCode=0 Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084722 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c00624a-9b7d-4593-821c-c76976b1c192" containerID="eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9" exitCode=0 Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084732 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c00624a-9b7d-4593-821c-c76976b1c192" containerID="b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06" exitCode=0 Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084741 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c00624a-9b7d-4593-821c-c76976b1c192" containerID="5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2" exitCode=0 Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084751 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c00624a-9b7d-4593-821c-c76976b1c192" containerID="b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358" exitCode=0 Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084761 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c00624a-9b7d-4593-821c-c76976b1c192" 
containerID="6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c" exitCode=0 Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084769 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c00624a-9b7d-4593-821c-c76976b1c192" containerID="f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a" exitCode=143 Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084781 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c00624a-9b7d-4593-821c-c76976b1c192" containerID="ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c" exitCode=143 Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084894 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084961 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084974 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084982 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.084992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085003 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085012 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085020 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085028 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085036 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085043 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085050 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085057 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085064 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085089 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085097 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085104 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085111 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085118 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085125 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085132 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 
05:56:35.085139 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085146 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" event={"ID":"2c00624a-9b7d-4593-821c-c76976b1c192","Type":"ContainerDied","Data":"fbe7c2ce9136d28d74ce63feb2a6d11a4eab0aaee3a6d51f98a8653d1bf6c11f"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085168 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085176 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085184 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085191 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085197 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085205 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085212 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085219 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085227 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085243 4707 scope.go:117] "RemoveContainer" containerID="efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.085374 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r5qsf" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.087197 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p9b84_fc127100-df64-48e7-bed0-620c796dd6b0/kube-multus/0.log" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.087237 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc127100-df64-48e7-bed0-620c796dd6b0" containerID="c21be25ee4dcb50109caffd8a9e46273adf7cf635882c8c70a9d4012c55dbb17" exitCode=2 Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.087262 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p9b84" event={"ID":"fc127100-df64-48e7-bed0-620c796dd6b0","Type":"ContainerDied","Data":"c21be25ee4dcb50109caffd8a9e46273adf7cf635882c8c70a9d4012c55dbb17"} Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.087763 4707 scope.go:117] "RemoveContainer" containerID="c21be25ee4dcb50109caffd8a9e46273adf7cf635882c8c70a9d4012c55dbb17" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-systemd-units\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-etc-openvswitch\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jk2sk\" (UniqueName: \"kubernetes.io/projected/939f3462-da28-4849-a4d8-6d28d455bfc9-kube-api-access-jk2sk\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-run-ovn-kubernetes\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/939f3462-da28-4849-a4d8-6d28d455bfc9-ovn-node-metrics-cert\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-kubelet\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/939f3462-da28-4849-a4d8-6d28d455bfc9-ovnkube-script-lib\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090408 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-node-log\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-log-socket\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/939f3462-da28-4849-a4d8-6d28d455bfc9-env-overrides\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-cni-netd\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090706 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-run-netns\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090737 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-cni-bin\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-slash\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.090866 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-run-openvswitch\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.091066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/939f3462-da28-4849-a4d8-6d28d455bfc9-ovnkube-config\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc 
kubenswrapper[4707]: I0218 05:56:35.091176 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-var-lib-openvswitch\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.091220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-run-ovn\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.091272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-run-systemd\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.092938 4707 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-node-log\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.092994 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c00624a-9b7d-4593-821c-c76976b1c192-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093118 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4p6s\" (UniqueName: \"kubernetes.io/projected/2c00624a-9b7d-4593-821c-c76976b1c192-kube-api-access-z4p6s\") on node \"crc\" DevicePath \"\"" Feb 
18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093163 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093187 4707 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093208 4707 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-log-socket\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093231 4707 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093256 4707 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093281 4707 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093309 4707 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093332 4707 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093355 4707 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093380 4707 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093402 4707 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093426 4707 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093451 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c00624a-9b7d-4593-821c-c76976b1c192-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093475 4707 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093498 4707 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093520 4707 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-host-slash\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.093542 4707 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c00624a-9b7d-4593-821c-c76976b1c192-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.101008 4707 scope.go:117] "RemoveContainer" containerID="eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.117066 4707 scope.go:117] "RemoveContainer" containerID="b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.133117 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r5qsf"] Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.137070 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r5qsf"] Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.137278 4707 scope.go:117] "RemoveContainer" containerID="5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.158574 4707 scope.go:117] "RemoveContainer" containerID="b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.175081 4707 scope.go:117] "RemoveContainer" containerID="6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.190596 4707 scope.go:117] "RemoveContainer" 
containerID="f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.194740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/939f3462-da28-4849-a4d8-6d28d455bfc9-ovn-node-metrics-cert\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.194770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-kubelet\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.194793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/939f3462-da28-4849-a4d8-6d28d455bfc9-ovnkube-script-lib\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.195059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-node-log\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.194933 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-kubelet\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 
05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.196167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-log-socket\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.196776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/939f3462-da28-4849-a4d8-6d28d455bfc9-ovnkube-script-lib\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.200538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/939f3462-da28-4849-a4d8-6d28d455bfc9-ovn-node-metrics-cert\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.202921 4707 scope.go:117] "RemoveContainer" containerID="ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207152 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-log-socket\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-node-log\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/939f3462-da28-4849-a4d8-6d28d455bfc9-env-overrides\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-cni-netd\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-run-netns\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-cni-bin\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-slash\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: 
I0218 05:56:35.207561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-run-netns\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-run-openvswitch\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/939f3462-da28-4849-a4d8-6d28d455bfc9-ovnkube-config\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207862 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-var-lib-openvswitch\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207887 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-run-ovn\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-cni-netd\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-run-openvswitch\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208088 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-cni-bin\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-slash\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-var-lib-openvswitch\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/939f3462-da28-4849-a4d8-6d28d455bfc9-env-overrides\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.207891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-run-ovn\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-run-systemd\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-systemd-units\") pod 
\"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208362 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-systemd-units\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-run-systemd\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-etc-openvswitch\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk2sk\" (UniqueName: \"kubernetes.io/projected/939f3462-da28-4849-a4d8-6d28d455bfc9-kube-api-access-jk2sk\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-etc-openvswitch\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-run-ovn-kubernetes\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/939f3462-da28-4849-a4d8-6d28d455bfc9-host-run-ovn-kubernetes\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.208500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/939f3462-da28-4849-a4d8-6d28d455bfc9-ovnkube-config\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.217637 4707 scope.go:117] "RemoveContainer" containerID="85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.225037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk2sk\" (UniqueName: \"kubernetes.io/projected/939f3462-da28-4849-a4d8-6d28d455bfc9-kube-api-access-jk2sk\") pod \"ovnkube-node-6v7n9\" (UID: \"939f3462-da28-4849-a4d8-6d28d455bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.265878 4707 scope.go:117] "RemoveContainer" containerID="efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3" Feb 18 05:56:35 crc 
kubenswrapper[4707]: E0218 05:56:35.267042 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": container with ID starting with efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3 not found: ID does not exist" containerID="efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.267942 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} err="failed to get container status \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": rpc error: code = NotFound desc = could not find container \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": container with ID starting with efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.268404 4707 scope.go:117] "RemoveContainer" containerID="eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9" Feb 18 05:56:35 crc kubenswrapper[4707]: E0218 05:56:35.268784 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": container with ID starting with eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9 not found: ID does not exist" containerID="eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.268832 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} err="failed to get container status 
\"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": rpc error: code = NotFound desc = could not find container \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": container with ID starting with eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.268853 4707 scope.go:117] "RemoveContainer" containerID="b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06" Feb 18 05:56:35 crc kubenswrapper[4707]: E0218 05:56:35.269041 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": container with ID starting with b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06 not found: ID does not exist" containerID="b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269061 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} err="failed to get container status \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": rpc error: code = NotFound desc = could not find container \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": container with ID starting with b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269073 4707 scope.go:117] "RemoveContainer" containerID="5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2" Feb 18 05:56:35 crc kubenswrapper[4707]: E0218 05:56:35.269245 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": container with ID starting with 5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2 not found: ID does not exist" containerID="5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269262 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} err="failed to get container status \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": rpc error: code = NotFound desc = could not find container \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": container with ID starting with 5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269288 4707 scope.go:117] "RemoveContainer" containerID="b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358" Feb 18 05:56:35 crc kubenswrapper[4707]: E0218 05:56:35.269431 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": container with ID starting with b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358 not found: ID does not exist" containerID="b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269450 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} err="failed to get container status \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": rpc error: code = NotFound desc = could not find container \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": container with ID 
starting with b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269461 4707 scope.go:117] "RemoveContainer" containerID="6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c" Feb 18 05:56:35 crc kubenswrapper[4707]: E0218 05:56:35.269603 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": container with ID starting with 6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c not found: ID does not exist" containerID="6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269622 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} err="failed to get container status \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": rpc error: code = NotFound desc = could not find container \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": container with ID starting with 6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269635 4707 scope.go:117] "RemoveContainer" containerID="f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a" Feb 18 05:56:35 crc kubenswrapper[4707]: E0218 05:56:35.269771 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a\": container with ID starting with f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a not found: ID does not exist" containerID="f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a" Feb 18 
05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269788 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} err="failed to get container status \"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a\": rpc error: code = NotFound desc = could not find container \"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a\": container with ID starting with f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.269812 4707 scope.go:117] "RemoveContainer" containerID="ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c" Feb 18 05:56:35 crc kubenswrapper[4707]: E0218 05:56:35.269980 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c\": container with ID starting with ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c not found: ID does not exist" containerID="ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270000 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} err="failed to get container status \"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c\": rpc error: code = NotFound desc = could not find container \"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c\": container with ID starting with ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270011 4707 scope.go:117] "RemoveContainer" 
containerID="85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b" Feb 18 05:56:35 crc kubenswrapper[4707]: E0218 05:56:35.270215 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b\": container with ID starting with 85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b not found: ID does not exist" containerID="85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270261 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b"} err="failed to get container status \"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b\": rpc error: code = NotFound desc = could not find container \"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b\": container with ID starting with 85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270291 4707 scope.go:117] "RemoveContainer" containerID="efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270494 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} err="failed to get container status \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": rpc error: code = NotFound desc = could not find container \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": container with ID starting with efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270517 4707 scope.go:117] 
"RemoveContainer" containerID="eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270676 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} err="failed to get container status \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": rpc error: code = NotFound desc = could not find container \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": container with ID starting with eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270693 4707 scope.go:117] "RemoveContainer" containerID="b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270849 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} err="failed to get container status \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": rpc error: code = NotFound desc = could not find container \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": container with ID starting with b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.270870 4707 scope.go:117] "RemoveContainer" containerID="5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.271046 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} err="failed to get container status \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": rpc error: code = 
NotFound desc = could not find container \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": container with ID starting with 5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.271086 4707 scope.go:117] "RemoveContainer" containerID="b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.271314 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} err="failed to get container status \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": rpc error: code = NotFound desc = could not find container \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": container with ID starting with b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.271331 4707 scope.go:117] "RemoveContainer" containerID="6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.271522 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} err="failed to get container status \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": rpc error: code = NotFound desc = could not find container \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": container with ID starting with 6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.271540 4707 scope.go:117] "RemoveContainer" containerID="f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a" Feb 18 05:56:35 crc 
kubenswrapper[4707]: I0218 05:56:35.271700 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} err="failed to get container status \"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a\": rpc error: code = NotFound desc = could not find container \"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a\": container with ID starting with f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.271714 4707 scope.go:117] "RemoveContainer" containerID="ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.271878 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} err="failed to get container status \"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c\": rpc error: code = NotFound desc = could not find container \"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c\": container with ID starting with ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.271892 4707 scope.go:117] "RemoveContainer" containerID="85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272041 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b"} err="failed to get container status \"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b\": rpc error: code = NotFound desc = could not find container \"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b\": container 
with ID starting with 85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272056 4707 scope.go:117] "RemoveContainer" containerID="efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272192 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} err="failed to get container status \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": rpc error: code = NotFound desc = could not find container \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": container with ID starting with efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272205 4707 scope.go:117] "RemoveContainer" containerID="eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272348 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} err="failed to get container status \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": rpc error: code = NotFound desc = could not find container \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": container with ID starting with eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272363 4707 scope.go:117] "RemoveContainer" containerID="b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272507 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} err="failed to get container status \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": rpc error: code = NotFound desc = could not find container \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": container with ID starting with b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272523 4707 scope.go:117] "RemoveContainer" containerID="5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272674 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} err="failed to get container status \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": rpc error: code = NotFound desc = could not find container \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": container with ID starting with 5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272688 4707 scope.go:117] "RemoveContainer" containerID="b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272840 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} err="failed to get container status \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": rpc error: code = NotFound desc = could not find container \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": container with ID starting with b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358 not found: ID does not 
exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.272855 4707 scope.go:117] "RemoveContainer" containerID="6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273007 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} err="failed to get container status \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": rpc error: code = NotFound desc = could not find container \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": container with ID starting with 6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273027 4707 scope.go:117] "RemoveContainer" containerID="f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273183 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} err="failed to get container status \"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a\": rpc error: code = NotFound desc = could not find container \"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a\": container with ID starting with f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273198 4707 scope.go:117] "RemoveContainer" containerID="ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273332 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} err="failed to get container status 
\"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c\": rpc error: code = NotFound desc = could not find container \"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c\": container with ID starting with ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273346 4707 scope.go:117] "RemoveContainer" containerID="85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273495 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b"} err="failed to get container status \"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b\": rpc error: code = NotFound desc = could not find container \"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b\": container with ID starting with 85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273510 4707 scope.go:117] "RemoveContainer" containerID="efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273663 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} err="failed to get container status \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": rpc error: code = NotFound desc = could not find container \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": container with ID starting with efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273679 4707 scope.go:117] "RemoveContainer" 
containerID="eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273831 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} err="failed to get container status \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": rpc error: code = NotFound desc = could not find container \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": container with ID starting with eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.273845 4707 scope.go:117] "RemoveContainer" containerID="b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274027 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} err="failed to get container status \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": rpc error: code = NotFound desc = could not find container \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": container with ID starting with b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274043 4707 scope.go:117] "RemoveContainer" containerID="5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274179 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} err="failed to get container status \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": rpc error: code = NotFound desc = could 
not find container \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": container with ID starting with 5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274193 4707 scope.go:117] "RemoveContainer" containerID="b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274326 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} err="failed to get container status \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": rpc error: code = NotFound desc = could not find container \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": container with ID starting with b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274344 4707 scope.go:117] "RemoveContainer" containerID="6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274476 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} err="failed to get container status \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": rpc error: code = NotFound desc = could not find container \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": container with ID starting with 6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274491 4707 scope.go:117] "RemoveContainer" containerID="f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 
05:56:35.274664 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a"} err="failed to get container status \"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a\": rpc error: code = NotFound desc = could not find container \"f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a\": container with ID starting with f0f747b37b81a4a181d6baf1e8b4dac0f07337fa9d5c2ecae8584916cdaac55a not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274679 4707 scope.go:117] "RemoveContainer" containerID="ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274828 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c"} err="failed to get container status \"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c\": rpc error: code = NotFound desc = could not find container \"ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c\": container with ID starting with ebf257d9807bbb06120e34d6d04d5d862d9da3b4d48e252fe9ca1205488d890c not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.274847 4707 scope.go:117] "RemoveContainer" containerID="85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275023 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b"} err="failed to get container status \"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b\": rpc error: code = NotFound desc = could not find container \"85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b\": container with ID starting with 
85fd7caf1efbeb77036f12095d429612fa77bafd4503387a54a0f74a6efd8d1b not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275039 4707 scope.go:117] "RemoveContainer" containerID="efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275178 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3"} err="failed to get container status \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": rpc error: code = NotFound desc = could not find container \"efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3\": container with ID starting with efe36d71722cbb730d262f363fa1689f045b4fdd31a76877c4229c9d2dca55c3 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275195 4707 scope.go:117] "RemoveContainer" containerID="eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275415 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9"} err="failed to get container status \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": rpc error: code = NotFound desc = could not find container \"eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9\": container with ID starting with eb7cfffb669e9d22ae97cf0c2721153d09dd73e1db67bdae5f38a07d036110c9 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275432 4707 scope.go:117] "RemoveContainer" containerID="b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275590 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06"} err="failed to get container status \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": rpc error: code = NotFound desc = could not find container \"b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06\": container with ID starting with b25b014716852998d3174c5b415e7c6ea96f96860effb852fbc90dc8199c1e06 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275609 4707 scope.go:117] "RemoveContainer" containerID="5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275763 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2"} err="failed to get container status \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": rpc error: code = NotFound desc = could not find container \"5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2\": container with ID starting with 5e926bb1dcab8b1fd2bc0869a44b58362d8df802d5d45e123e3ad0795c993ef2 not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275780 4707 scope.go:117] "RemoveContainer" containerID="b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275951 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358"} err="failed to get container status \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": rpc error: code = NotFound desc = could not find container \"b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358\": container with ID starting with b781e9379f8d6a7e2fe4fcaef9a10bdf79165d54167d72f86dc8c66ffcead358 not found: ID does not 
exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.275966 4707 scope.go:117] "RemoveContainer" containerID="6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.276148 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c"} err="failed to get container status \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": rpc error: code = NotFound desc = could not find container \"6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c\": container with ID starting with 6a50b106141f69c211fc6fd956d662efe9c92da184149262b25198a8eb5f334c not found: ID does not exist" Feb 18 05:56:35 crc kubenswrapper[4707]: I0218 05:56:35.520256 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:35 crc kubenswrapper[4707]: W0218 05:56:35.544707 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod939f3462_da28_4849_a4d8_6d28d455bfc9.slice/crio-9b2286abf51fdd146115878d5c7d363a3c56b1b5f4e7d30deda13e61903336a7 WatchSource:0}: Error finding container 9b2286abf51fdd146115878d5c7d363a3c56b1b5f4e7d30deda13e61903336a7: Status 404 returned error can't find the container with id 9b2286abf51fdd146115878d5c7d363a3c56b1b5f4e7d30deda13e61903336a7 Feb 18 05:56:36 crc kubenswrapper[4707]: I0218 05:56:36.063675 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c00624a-9b7d-4593-821c-c76976b1c192" path="/var/lib/kubelet/pods/2c00624a-9b7d-4593-821c-c76976b1c192/volumes" Feb 18 05:56:36 crc kubenswrapper[4707]: I0218 05:56:36.094535 4707 generic.go:334] "Generic (PLEG): container finished" podID="939f3462-da28-4849-a4d8-6d28d455bfc9" 
containerID="642ce3026eeef69e6782a09cceef140e3504561109406ca4eb1f323df7d41a94" exitCode=0 Feb 18 05:56:36 crc kubenswrapper[4707]: I0218 05:56:36.094583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerDied","Data":"642ce3026eeef69e6782a09cceef140e3504561109406ca4eb1f323df7d41a94"} Feb 18 05:56:36 crc kubenswrapper[4707]: I0218 05:56:36.094604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerStarted","Data":"9b2286abf51fdd146115878d5c7d363a3c56b1b5f4e7d30deda13e61903336a7"} Feb 18 05:56:36 crc kubenswrapper[4707]: I0218 05:56:36.097281 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p9b84_fc127100-df64-48e7-bed0-620c796dd6b0/kube-multus/0.log" Feb 18 05:56:36 crc kubenswrapper[4707]: I0218 05:56:36.097327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p9b84" event={"ID":"fc127100-df64-48e7-bed0-620c796dd6b0","Type":"ContainerStarted","Data":"a4494976e2cd5fc98139bbfdf6da5f46386666ab43fe5d4cf9946e1220429de1"} Feb 18 05:56:37 crc kubenswrapper[4707]: I0218 05:56:37.113032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerStarted","Data":"c2cdf42882a66e50e2681a465c097c72f11adcf011a2664fa440fed65d19c811"} Feb 18 05:56:37 crc kubenswrapper[4707]: I0218 05:56:37.113516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerStarted","Data":"c599aad19c7d2273ff51aa82607a7278c10342c09b35f0af1cbef42b41e9165a"} Feb 18 05:56:37 crc kubenswrapper[4707]: I0218 05:56:37.113531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerStarted","Data":"a068dc72de0c4d2d91e76403132259735eb3d420a34b193276caf1d8808da4f8"} Feb 18 05:56:37 crc kubenswrapper[4707]: I0218 05:56:37.113562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerStarted","Data":"2d7c7552c65b23d8e9f7d7b9be0f73b5fe67c6bea35c15385e8a5bef9360f526"} Feb 18 05:56:37 crc kubenswrapper[4707]: I0218 05:56:37.113577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerStarted","Data":"d3ac6a93a11716dae0c3005abed2dc4428c2f7a9d7943b1ee1d7ed33a1bbcb06"} Feb 18 05:56:37 crc kubenswrapper[4707]: I0218 05:56:37.113593 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerStarted","Data":"54e5f6637e22287091930acb2ab23dff4fd81c3ed18a65abf435a87e07c48e2f"} Feb 18 05:56:39 crc kubenswrapper[4707]: I0218 05:56:39.127099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerStarted","Data":"fa1f5b24be7d0fe37e63c8b07e3ed1b67064624ffc59def2a7f02a5aed89d1ad"} Feb 18 05:56:42 crc kubenswrapper[4707]: I0218 05:56:42.144150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" event={"ID":"939f3462-da28-4849-a4d8-6d28d455bfc9","Type":"ContainerStarted","Data":"495b111aede1e46a26046a39b95005e4759c0cea4ae8d0bba550dfdd2bfdd1ec"} Feb 18 05:56:42 crc kubenswrapper[4707]: I0218 05:56:42.144560 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 
05:56:42 crc kubenswrapper[4707]: I0218 05:56:42.144608 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:42 crc kubenswrapper[4707]: I0218 05:56:42.144657 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:42 crc kubenswrapper[4707]: I0218 05:56:42.193844 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:42 crc kubenswrapper[4707]: I0218 05:56:42.204218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" Feb 18 05:56:42 crc kubenswrapper[4707]: I0218 05:56:42.228460 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9" podStartSLOduration=8.22844329 podStartE2EDuration="8.22844329s" podCreationTimestamp="2026-02-18 05:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:56:42.172091019 +0000 UTC m=+538.820050173" watchObservedRunningTime="2026-02-18 05:56:42.22844329 +0000 UTC m=+538.876402424" Feb 18 05:56:51 crc kubenswrapper[4707]: I0218 05:56:51.382897 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:56:51 crc kubenswrapper[4707]: I0218 05:56:51.383434 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.645847 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.647745 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.649970 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.650162 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rwxlm" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.651588 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.785235 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/1b24c097-807d-43e6-aaa5-b9abfb48bff5-run\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.785593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b24c097-807d-43e6-aaa5-b9abfb48bff5-data\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.785734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/1b24c097-807d-43e6-aaa5-b9abfb48bff5-log\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.785877 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t47bh\" (UniqueName: \"kubernetes.io/projected/1b24c097-807d-43e6-aaa5-b9abfb48bff5-kube-api-access-t47bh\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.887330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/1b24c097-807d-43e6-aaa5-b9abfb48bff5-log\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.887652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t47bh\" (UniqueName: \"kubernetes.io/projected/1b24c097-807d-43e6-aaa5-b9abfb48bff5-kube-api-access-t47bh\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.887952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/1b24c097-807d-43e6-aaa5-b9abfb48bff5-run\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.888160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b24c097-807d-43e6-aaa5-b9abfb48bff5-data\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.888331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/1b24c097-807d-43e6-aaa5-b9abfb48bff5-log\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.888482 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/1b24c097-807d-43e6-aaa5-b9abfb48bff5-run\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.889254 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b24c097-807d-43e6-aaa5-b9abfb48bff5-data\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.914000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t47bh\" (UniqueName: \"kubernetes.io/projected/1b24c097-807d-43e6-aaa5-b9abfb48bff5-kube-api-access-t47bh\") pod \"ceph\" (UID: \"1b24c097-807d-43e6-aaa5-b9abfb48bff5\") " pod="openstack/ceph" Feb 18 05:56:57 crc kubenswrapper[4707]: I0218 05:56:57.976826 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph"
Feb 18 05:56:58 crc kubenswrapper[4707]: W0218 05:56:58.010945 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b24c097_807d_43e6_aaa5_b9abfb48bff5.slice/crio-66e563375a8e65e62e0923659026f55dde742bb9daf19a5d2eb7267e9c5f0260 WatchSource:0}: Error finding container 66e563375a8e65e62e0923659026f55dde742bb9daf19a5d2eb7267e9c5f0260: Status 404 returned error can't find the container with id 66e563375a8e65e62e0923659026f55dde742bb9daf19a5d2eb7267e9c5f0260
Feb 18 05:56:58 crc kubenswrapper[4707]: E0218 05:56:58.028302 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:56:58 crc kubenswrapper[4707]: E0218 05:56:58.046518 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:56:58 crc kubenswrapper[4707]: I0218 05:56:58.235247 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"1b24c097-807d-43e6-aaa5-b9abfb48bff5","Type":"ContainerStarted","Data":"66e563375a8e65e62e0923659026f55dde742bb9daf19a5d2eb7267e9c5f0260"}
Feb 18 05:56:59 crc kubenswrapper[4707]: E0218 05:56:59.249024 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:56:59 crc kubenswrapper[4707]: E0218 05:56:59.261977 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:00 crc kubenswrapper[4707]: E0218 05:57:00.477115 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:00 crc kubenswrapper[4707]: E0218 05:57:00.490554 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:01 crc kubenswrapper[4707]: E0218 05:57:01.713662 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:01 crc kubenswrapper[4707]: E0218 05:57:01.727525 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:02 crc kubenswrapper[4707]: E0218 05:57:02.915244 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:02 crc kubenswrapper[4707]: E0218 05:57:02.929880 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:04 crc kubenswrapper[4707]: E0218 05:57:04.053922 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:04 crc kubenswrapper[4707]: E0218 05:57:04.066889 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:05 crc kubenswrapper[4707]: E0218 05:57:05.250331 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:05 crc kubenswrapper[4707]: E0218 05:57:05.263821 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:05 crc kubenswrapper[4707]: I0218 05:57:05.539134 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6v7n9"
Feb 18 05:57:06 crc kubenswrapper[4707]: E0218 05:57:06.387418 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:06 crc kubenswrapper[4707]: E0218 05:57:06.400531 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:07 crc kubenswrapper[4707]: E0218 05:57:07.605443 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:07 crc kubenswrapper[4707]: E0218 05:57:07.624758 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:08 crc kubenswrapper[4707]: E0218 05:57:08.820032 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:08 crc kubenswrapper[4707]: E0218 05:57:08.834426 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:09 crc kubenswrapper[4707]: E0218 05:57:09.976654 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:09 crc kubenswrapper[4707]: E0218 05:57:09.989723 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:11 crc kubenswrapper[4707]: E0218 05:57:11.148534 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:11 crc kubenswrapper[4707]: E0218 05:57:11.163149 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:12 crc kubenswrapper[4707]: E0218 05:57:12.322358 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:12 crc kubenswrapper[4707]: E0218 05:57:12.336708 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:13 crc kubenswrapper[4707]: E0218 05:57:13.528367 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:13 crc kubenswrapper[4707]: E0218 05:57:13.545774 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:14 crc kubenswrapper[4707]: I0218 05:57:14.336212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"1b24c097-807d-43e6-aaa5-b9abfb48bff5","Type":"ContainerStarted","Data":"567feb35ff7627e83378c7459a5800693cf50757440615d5f01b8080a371fd62"}
Feb 18 05:57:14 crc kubenswrapper[4707]: I0218 05:57:14.353075 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=1.989201408 podStartE2EDuration="17.353045787s" podCreationTimestamp="2026-02-18 05:56:57 +0000 UTC" firstStartedPulling="2026-02-18 05:56:58.018366567 +0000 UTC m=+554.666325751" lastFinishedPulling="2026-02-18 05:57:13.382210996 +0000 UTC m=+570.030170130" observedRunningTime="2026-02-18 05:57:14.349030798 +0000 UTC m=+570.996989932" watchObservedRunningTime="2026-02-18 05:57:14.353045787 +0000 UTC m=+571.001004941"
Feb 18 05:57:14 crc kubenswrapper[4707]: E0218 05:57:14.688731 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:14 crc kubenswrapper[4707]: E0218 05:57:14.707738 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:15 crc kubenswrapper[4707]: E0218 05:57:15.915225 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:15 crc kubenswrapper[4707]: E0218 05:57:15.935149 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:17 crc kubenswrapper[4707]: E0218 05:57:17.089783 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:17 crc kubenswrapper[4707]: E0218 05:57:17.101675 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:18 crc kubenswrapper[4707]: E0218 05:57:18.282437 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:18 crc kubenswrapper[4707]: E0218 05:57:18.305538 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:19 crc kubenswrapper[4707]: E0218 05:57:19.510346 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:19 crc kubenswrapper[4707]: E0218 05:57:19.529161 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:20 crc kubenswrapper[4707]: E0218 05:57:20.718947 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:20 crc kubenswrapper[4707]: E0218 05:57:20.745274 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:21 crc kubenswrapper[4707]: I0218 05:57:21.382503 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 05:57:21 crc kubenswrapper[4707]: I0218 05:57:21.382576 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 05:57:21 crc kubenswrapper[4707]: I0218 05:57:21.382625 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6"
Feb 18 05:57:21 crc kubenswrapper[4707]: I0218 05:57:21.383351 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d49d6adeee4f1d9b81111c98288055c114a1e0649058be818dd7f0b90f16510a"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 05:57:21 crc kubenswrapper[4707]: I0218 05:57:21.383431 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://d49d6adeee4f1d9b81111c98288055c114a1e0649058be818dd7f0b90f16510a" gracePeriod=600
Feb 18 05:57:21 crc kubenswrapper[4707]: E0218 05:57:21.958903 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:21 crc kubenswrapper[4707]: E0218 05:57:21.971110 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:22 crc kubenswrapper[4707]: I0218 05:57:22.388777 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="d49d6adeee4f1d9b81111c98288055c114a1e0649058be818dd7f0b90f16510a" exitCode=0
Feb 18 05:57:22 crc kubenswrapper[4707]: I0218 05:57:22.388835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"d49d6adeee4f1d9b81111c98288055c114a1e0649058be818dd7f0b90f16510a"}
Feb 18 05:57:22 crc kubenswrapper[4707]: I0218 05:57:22.388902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"e32285faa40540ae5e3f1a855f8e56122b182016a9d9b345bafd252cfac4ced1"}
Feb 18 05:57:22 crc kubenswrapper[4707]: I0218 05:57:22.388928 4707 scope.go:117] "RemoveContainer" containerID="c30aded239e5b6a5a3a43a43d3d8062408ccb46a9109591c1b4a41345a3dce40"
Feb 18 05:57:23 crc kubenswrapper[4707]: E0218 05:57:23.170052 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:23 crc kubenswrapper[4707]: E0218 05:57:23.182720 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:24 crc kubenswrapper[4707]: E0218 05:57:24.360920 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:24 crc kubenswrapper[4707]: E0218 05:57:24.380997 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:25 crc kubenswrapper[4707]: E0218 05:57:25.560964 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:25 crc kubenswrapper[4707]: E0218 05:57:25.575495 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:26 crc kubenswrapper[4707]: E0218 05:57:26.763357 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:26 crc kubenswrapper[4707]: E0218 05:57:26.779192 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:27 crc kubenswrapper[4707]: E0218 05:57:27.974079 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:27 crc kubenswrapper[4707]: E0218 05:57:27.991952 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:29 crc kubenswrapper[4707]: E0218 05:57:29.143214 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:29 crc kubenswrapper[4707]: E0218 05:57:29.156577 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:30 crc kubenswrapper[4707]: E0218 05:57:30.294455 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:30 crc kubenswrapper[4707]: E0218 05:57:30.311320 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:31 crc kubenswrapper[4707]: E0218 05:57:31.488596 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:31 crc kubenswrapper[4707]: E0218 05:57:31.506575 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:32 crc kubenswrapper[4707]: E0218 05:57:32.686891 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:32 crc kubenswrapper[4707]: E0218 05:57:32.701657 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:33 crc kubenswrapper[4707]: E0218 05:57:33.923658 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:33 crc kubenswrapper[4707]: E0218 05:57:33.937571 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:35 crc kubenswrapper[4707]: E0218 05:57:35.077143 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:35 crc kubenswrapper[4707]: E0218 05:57:35.094681 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:36 crc kubenswrapper[4707]: E0218 05:57:36.247892 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:36 crc kubenswrapper[4707]: E0218 05:57:36.261061 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:37 crc kubenswrapper[4707]: E0218 05:57:37.485699 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:37 crc kubenswrapper[4707]: E0218 05:57:37.508158 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:38 crc kubenswrapper[4707]: E0218 05:57:38.674618 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:38 crc kubenswrapper[4707]: E0218 05:57:38.693757 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:39 crc kubenswrapper[4707]: E0218 05:57:39.865558 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:39 crc kubenswrapper[4707]: E0218 05:57:39.883311 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:41 crc kubenswrapper[4707]: E0218 05:57:41.111271 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:41 crc kubenswrapper[4707]: E0218 05:57:41.127522 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:42 crc kubenswrapper[4707]: E0218 05:57:42.336014 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:42 crc kubenswrapper[4707]: E0218 05:57:42.352917 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:43 crc kubenswrapper[4707]: E0218 05:57:43.558468 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:43 crc kubenswrapper[4707]: E0218 05:57:43.580610 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:44 crc kubenswrapper[4707]: E0218 05:57:44.743768 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:44 crc kubenswrapper[4707]: E0218 05:57:44.758314 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:45 crc kubenswrapper[4707]: E0218 05:57:45.986078 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:46 crc kubenswrapper[4707]: E0218 05:57:46.002407 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:47 crc kubenswrapper[4707]: E0218 05:57:47.171230 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:47 crc kubenswrapper[4707]: E0218 05:57:47.185210 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:48 crc kubenswrapper[4707]: E0218 05:57:48.355449 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:48 crc kubenswrapper[4707]: E0218 05:57:48.374939 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:49 crc kubenswrapper[4707]: E0218 05:57:49.587751 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:49 crc kubenswrapper[4707]: E0218 05:57:49.609343 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:50 crc kubenswrapper[4707]: E0218 05:57:50.764034 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:50 crc kubenswrapper[4707]: E0218 05:57:50.778323 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:51 crc kubenswrapper[4707]: E0218 05:57:51.961423 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:51 crc kubenswrapper[4707]: E0218 05:57:51.976598 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:53 crc kubenswrapper[4707]: E0218 05:57:53.158137 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:53 crc kubenswrapper[4707]: E0218 05:57:53.204006 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:54 crc kubenswrapper[4707]: E0218 05:57:54.371163 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:54 crc kubenswrapper[4707]: E0218 05:57:54.386088 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:55 crc kubenswrapper[4707]: E0218 05:57:55.618231 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:55 crc kubenswrapper[4707]: E0218 05:57:55.631992 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:56 crc kubenswrapper[4707]: E0218 05:57:56.781140 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:56 crc kubenswrapper[4707]: E0218 05:57:56.797270 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:57 crc kubenswrapper[4707]: E0218 05:57:57.995778 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:58 crc kubenswrapper[4707]: E0218 05:57:58.017012 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:59 crc kubenswrapper[4707]: E0218 05:57:59.198922 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:57:59 crc kubenswrapper[4707]: E0218 05:57:59.217740 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:00 crc kubenswrapper[4707]: E0218 05:58:00.365127 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:00 crc kubenswrapper[4707]: E0218 05:58:00.379697 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:01 crc kubenswrapper[4707]: E0218 05:58:01.512266 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:01 crc kubenswrapper[4707]: E0218 05:58:01.533480 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:02 crc kubenswrapper[4707]: E0218 05:58:02.696373 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:02 crc kubenswrapper[4707]: E0218 05:58:02.708461 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:03 crc kubenswrapper[4707]: E0218 05:58:03.859525 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:03 crc kubenswrapper[4707]: E0218 05:58:03.874385 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:05 crc kubenswrapper[4707]: E0218 05:58:05.093172 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:05 crc kubenswrapper[4707]: E0218 05:58:05.107721 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:06 crc kubenswrapper[4707]: E0218 05:58:06.267325 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:06 crc kubenswrapper[4707]: E0218 05:58:06.286184 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:07 crc kubenswrapper[4707]: E0218 05:58:07.455354 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority"
Feb 18 05:58:07 crc kubenswrapper[4707]: E0218 05:58:07.468863 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, 
AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:08 crc kubenswrapper[4707]: E0218 05:58:08.704425 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:08 crc kubenswrapper[4707]: E0218 05:58:08.724106 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:09 crc kubenswrapper[4707]: E0218 05:58:09.855087 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:09 crc kubenswrapper[4707]: E0218 05:58:09.874946 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:11 crc kubenswrapper[4707]: E0218 05:58:11.072859 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:11 crc kubenswrapper[4707]: E0218 05:58:11.092175 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown 
authority" Feb 18 05:58:12 crc kubenswrapper[4707]: E0218 05:58:12.271936 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:12 crc kubenswrapper[4707]: E0218 05:58:12.288323 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:13 crc kubenswrapper[4707]: E0218 05:58:13.417677 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:13 crc kubenswrapper[4707]: E0218 05:58:13.434651 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:14 crc kubenswrapper[4707]: E0218 05:58:14.654693 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:14 crc kubenswrapper[4707]: E0218 05:58:14.667717 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:15 crc kubenswrapper[4707]: E0218 05:58:15.849663 4707 server.go:309] "Unable to 
authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:15 crc kubenswrapper[4707]: E0218 05:58:15.868656 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:17 crc kubenswrapper[4707]: E0218 05:58:17.088587 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:17 crc kubenswrapper[4707]: E0218 05:58:17.107875 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:18 crc kubenswrapper[4707]: E0218 05:58:18.326268 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:18 crc kubenswrapper[4707]: E0218 05:58:18.341595 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:19 crc kubenswrapper[4707]: E0218 05:58:19.578110 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, 
AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:19 crc kubenswrapper[4707]: E0218 05:58:19.590391 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:20 crc kubenswrapper[4707]: E0218 05:58:20.754465 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:20 crc kubenswrapper[4707]: E0218 05:58:20.768962 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:22 crc kubenswrapper[4707]: E0218 05:58:22.010199 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:22 crc kubenswrapper[4707]: E0218 05:58:22.026467 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:23 crc kubenswrapper[4707]: E0218 05:58:23.243564 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown 
authority" Feb 18 05:58:23 crc kubenswrapper[4707]: E0218 05:58:23.259984 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:24 crc kubenswrapper[4707]: E0218 05:58:24.429241 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:24 crc kubenswrapper[4707]: E0218 05:58:24.441997 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:25 crc kubenswrapper[4707]: E0218 05:58:25.648976 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:25 crc kubenswrapper[4707]: E0218 05:58:25.667890 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:26 crc kubenswrapper[4707]: E0218 05:58:26.881888 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:26 crc kubenswrapper[4707]: E0218 05:58:26.893733 4707 server.go:309] "Unable to 
authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:28 crc kubenswrapper[4707]: E0218 05:58:28.111046 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:28 crc kubenswrapper[4707]: E0218 05:58:28.126070 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:29 crc kubenswrapper[4707]: E0218 05:58:29.292221 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:29 crc kubenswrapper[4707]: E0218 05:58:29.312320 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:30 crc kubenswrapper[4707]: E0218 05:58:30.451313 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:30 crc kubenswrapper[4707]: E0218 05:58:30.469153 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, 
AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:31 crc kubenswrapper[4707]: E0218 05:58:31.655228 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:31 crc kubenswrapper[4707]: E0218 05:58:31.680060 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:32 crc kubenswrapper[4707]: E0218 05:58:32.879166 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:32 crc kubenswrapper[4707]: E0218 05:58:32.893329 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:34 crc kubenswrapper[4707]: E0218 05:58:34.116481 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:34 crc kubenswrapper[4707]: E0218 05:58:34.135136 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown 
authority" Feb 18 05:58:35 crc kubenswrapper[4707]: E0218 05:58:35.308681 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:35 crc kubenswrapper[4707]: E0218 05:58:35.324230 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:36 crc kubenswrapper[4707]: E0218 05:58:36.491669 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:36 crc kubenswrapper[4707]: E0218 05:58:36.505334 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:37 crc kubenswrapper[4707]: E0218 05:58:37.671118 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:37 crc kubenswrapper[4707]: E0218 05:58:37.684701 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:38 crc kubenswrapper[4707]: E0218 05:58:38.881914 4707 server.go:309] "Unable to 
authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:38 crc kubenswrapper[4707]: E0218 05:58:38.902119 4707 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=7734718397808596247, SKID=, AKID=B0:4D:46:C4:D9:1A:FB:1E:CF:DF:D2:ED:8B:A8:E7:7D:56:A0:64:7A failed: x509: certificate signed by unknown authority" Feb 18 05:58:39 crc kubenswrapper[4707]: I0218 05:58:39.267000 4707 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 18 05:58:48 crc kubenswrapper[4707]: I0218 05:58:48.933373 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pxgc2"] Feb 18 05:58:48 crc kubenswrapper[4707]: I0218 05:58:48.940334 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:48 crc kubenswrapper[4707]: I0218 05:58:48.944002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxgc2"] Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.131266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srds2\" (UniqueName: \"kubernetes.io/projected/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-kube-api-access-srds2\") pod \"redhat-marketplace-pxgc2\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.131330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-catalog-content\") pod \"redhat-marketplace-pxgc2\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.131576 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-utilities\") pod \"redhat-marketplace-pxgc2\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.232763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-utilities\") pod \"redhat-marketplace-pxgc2\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.233179 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-srds2\" (UniqueName: \"kubernetes.io/projected/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-kube-api-access-srds2\") pod \"redhat-marketplace-pxgc2\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.233200 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-catalog-content\") pod \"redhat-marketplace-pxgc2\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.233291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-utilities\") pod \"redhat-marketplace-pxgc2\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.233635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-catalog-content\") pod \"redhat-marketplace-pxgc2\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.261675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srds2\" (UniqueName: \"kubernetes.io/projected/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-kube-api-access-srds2\") pod \"redhat-marketplace-pxgc2\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.270380 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.537247 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxgc2"] Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.943730 4707 generic.go:334] "Generic (PLEG): container finished" podID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerID="3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a" exitCode=0 Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.943772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxgc2" event={"ID":"0086369d-ae40-4c4d-be3d-c3e3f06c93f4","Type":"ContainerDied","Data":"3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a"} Feb 18 05:58:49 crc kubenswrapper[4707]: I0218 05:58:49.943818 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxgc2" event={"ID":"0086369d-ae40-4c4d-be3d-c3e3f06c93f4","Type":"ContainerStarted","Data":"b78d14f2dd33209ba30a192999085125f9b8b5706ae81d4127896d5f489fa148"} Feb 18 05:58:50 crc kubenswrapper[4707]: I0218 05:58:50.951734 4707 generic.go:334] "Generic (PLEG): container finished" podID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerID="3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda" exitCode=0 Feb 18 05:58:50 crc kubenswrapper[4707]: I0218 05:58:50.952116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxgc2" event={"ID":"0086369d-ae40-4c4d-be3d-c3e3f06c93f4","Type":"ContainerDied","Data":"3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda"} Feb 18 05:58:51 crc kubenswrapper[4707]: I0218 05:58:51.959159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxgc2" 
event={"ID":"0086369d-ae40-4c4d-be3d-c3e3f06c93f4","Type":"ContainerStarted","Data":"c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b"} Feb 18 05:58:51 crc kubenswrapper[4707]: I0218 05:58:51.989460 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pxgc2" podStartSLOduration=2.556086816 podStartE2EDuration="3.98944338s" podCreationTimestamp="2026-02-18 05:58:48 +0000 UTC" firstStartedPulling="2026-02-18 05:58:49.944909757 +0000 UTC m=+666.592868881" lastFinishedPulling="2026-02-18 05:58:51.378266301 +0000 UTC m=+668.026225445" observedRunningTime="2026-02-18 05:58:51.987422625 +0000 UTC m=+668.635381819" watchObservedRunningTime="2026-02-18 05:58:51.98944338 +0000 UTC m=+668.637402514" Feb 18 05:58:59 crc kubenswrapper[4707]: I0218 05:58:59.271770 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:59 crc kubenswrapper[4707]: I0218 05:58:59.272611 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:58:59 crc kubenswrapper[4707]: I0218 05:58:59.341226 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:59:00 crc kubenswrapper[4707]: I0218 05:59:00.064690 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:59:00 crc kubenswrapper[4707]: I0218 05:59:00.104341 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxgc2"] Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.017385 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pxgc2" podUID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerName="registry-server" 
containerID="cri-o://c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b" gracePeriod=2 Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.399994 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.600753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srds2\" (UniqueName: \"kubernetes.io/projected/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-kube-api-access-srds2\") pod \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.600865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-catalog-content\") pod \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.600930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-utilities\") pod \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\" (UID: \"0086369d-ae40-4c4d-be3d-c3e3f06c93f4\") " Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.601783 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-utilities" (OuterVolumeSpecName: "utilities") pod "0086369d-ae40-4c4d-be3d-c3e3f06c93f4" (UID: "0086369d-ae40-4c4d-be3d-c3e3f06c93f4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.615135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-kube-api-access-srds2" (OuterVolumeSpecName: "kube-api-access-srds2") pod "0086369d-ae40-4c4d-be3d-c3e3f06c93f4" (UID: "0086369d-ae40-4c4d-be3d-c3e3f06c93f4"). InnerVolumeSpecName "kube-api-access-srds2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.621488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0086369d-ae40-4c4d-be3d-c3e3f06c93f4" (UID: "0086369d-ae40-4c4d-be3d-c3e3f06c93f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.702433 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srds2\" (UniqueName: \"kubernetes.io/projected/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-kube-api-access-srds2\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.702472 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.702484 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0086369d-ae40-4c4d-be3d-c3e3f06c93f4-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.988240 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6q5x7"] Feb 18 05:59:02 crc kubenswrapper[4707]: E0218 
05:59:02.989083 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerName="extract-utilities" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.989108 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerName="extract-utilities" Feb 18 05:59:02 crc kubenswrapper[4707]: E0218 05:59:02.989166 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerName="registry-server" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.989176 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerName="registry-server" Feb 18 05:59:02 crc kubenswrapper[4707]: E0218 05:59:02.989194 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerName="extract-content" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.989202 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerName="extract-content" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.989443 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerName="registry-server" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.990556 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:02 crc kubenswrapper[4707]: I0218 05:59:02.997395 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q5x7"] Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.006046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn7dn\" (UniqueName: \"kubernetes.io/projected/49418ce7-ae04-432a-8e14-4603b03dc5ef-kube-api-access-mn7dn\") pod \"community-operators-6q5x7\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.006140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-utilities\") pod \"community-operators-6q5x7\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.006227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-catalog-content\") pod \"community-operators-6q5x7\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.025976 4707 generic.go:334] "Generic (PLEG): container finished" podID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" containerID="c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b" exitCode=0 Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.026026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxgc2" 
event={"ID":"0086369d-ae40-4c4d-be3d-c3e3f06c93f4","Type":"ContainerDied","Data":"c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b"} Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.026057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxgc2" event={"ID":"0086369d-ae40-4c4d-be3d-c3e3f06c93f4","Type":"ContainerDied","Data":"b78d14f2dd33209ba30a192999085125f9b8b5706ae81d4127896d5f489fa148"} Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.026079 4707 scope.go:117] "RemoveContainer" containerID="c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.026239 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxgc2" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.047832 4707 scope.go:117] "RemoveContainer" containerID="3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.059214 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxgc2"] Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.069759 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxgc2"] Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.070062 4707 scope.go:117] "RemoveContainer" containerID="3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.087545 4707 scope.go:117] "RemoveContainer" containerID="c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b" Feb 18 05:59:03 crc kubenswrapper[4707]: E0218 05:59:03.088427 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b\": container 
with ID starting with c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b not found: ID does not exist" containerID="c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.088482 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b"} err="failed to get container status \"c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b\": rpc error: code = NotFound desc = could not find container \"c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b\": container with ID starting with c22ac4358d27e2ccf0df92f0a1615341f81d47d2abfe482477c5b2b65c169f6b not found: ID does not exist" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.088516 4707 scope.go:117] "RemoveContainer" containerID="3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda" Feb 18 05:59:03 crc kubenswrapper[4707]: E0218 05:59:03.088950 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda\": container with ID starting with 3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda not found: ID does not exist" containerID="3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.088993 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda"} err="failed to get container status \"3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda\": rpc error: code = NotFound desc = could not find container \"3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda\": container with ID starting with 3c9be557a51a58b4fda1f5382bfd51fdd07202ba4caecd409be64df837cc8bda not 
found: ID does not exist" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.089029 4707 scope.go:117] "RemoveContainer" containerID="3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a" Feb 18 05:59:03 crc kubenswrapper[4707]: E0218 05:59:03.089351 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a\": container with ID starting with 3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a not found: ID does not exist" containerID="3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.089383 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a"} err="failed to get container status \"3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a\": rpc error: code = NotFound desc = could not find container \"3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a\": container with ID starting with 3fcb7f934dc74b835d6a9c1412056de99115a81d5f72ca25b064e22931c3c24a not found: ID does not exist" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.107286 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-utilities\") pod \"community-operators-6q5x7\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.107371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-catalog-content\") pod \"community-operators-6q5x7\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " 
pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.107423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn7dn\" (UniqueName: \"kubernetes.io/projected/49418ce7-ae04-432a-8e14-4603b03dc5ef-kube-api-access-mn7dn\") pod \"community-operators-6q5x7\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.107826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-utilities\") pod \"community-operators-6q5x7\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.108002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-catalog-content\") pod \"community-operators-6q5x7\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.124093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn7dn\" (UniqueName: \"kubernetes.io/projected/49418ce7-ae04-432a-8e14-4603b03dc5ef-kube-api-access-mn7dn\") pod \"community-operators-6q5x7\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.318311 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:03 crc kubenswrapper[4707]: I0218 05:59:03.543706 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q5x7"] Feb 18 05:59:04 crc kubenswrapper[4707]: I0218 05:59:04.034832 4707 generic.go:334] "Generic (PLEG): container finished" podID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerID="df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047" exitCode=0 Feb 18 05:59:04 crc kubenswrapper[4707]: I0218 05:59:04.034944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5x7" event={"ID":"49418ce7-ae04-432a-8e14-4603b03dc5ef","Type":"ContainerDied","Data":"df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047"} Feb 18 05:59:04 crc kubenswrapper[4707]: I0218 05:59:04.035406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5x7" event={"ID":"49418ce7-ae04-432a-8e14-4603b03dc5ef","Type":"ContainerStarted","Data":"a58372381135573017ff6d4f31028172681601bb6e9609aade5a973012f0fd04"} Feb 18 05:59:04 crc kubenswrapper[4707]: I0218 05:59:04.065624 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0086369d-ae40-4c4d-be3d-c3e3f06c93f4" path="/var/lib/kubelet/pods/0086369d-ae40-4c4d-be3d-c3e3f06c93f4/volumes" Feb 18 05:59:05 crc kubenswrapper[4707]: I0218 05:59:05.044058 4707 generic.go:334] "Generic (PLEG): container finished" podID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerID="693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc" exitCode=0 Feb 18 05:59:05 crc kubenswrapper[4707]: I0218 05:59:05.044293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5x7" event={"ID":"49418ce7-ae04-432a-8e14-4603b03dc5ef","Type":"ContainerDied","Data":"693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc"} Feb 18 05:59:06 crc 
kubenswrapper[4707]: I0218 05:59:06.063292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5x7" event={"ID":"49418ce7-ae04-432a-8e14-4603b03dc5ef","Type":"ContainerStarted","Data":"acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24"} Feb 18 05:59:06 crc kubenswrapper[4707]: I0218 05:59:06.081535 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6q5x7" podStartSLOduration=2.689415676 podStartE2EDuration="4.0815057s" podCreationTimestamp="2026-02-18 05:59:02 +0000 UTC" firstStartedPulling="2026-02-18 05:59:04.037314906 +0000 UTC m=+680.685274050" lastFinishedPulling="2026-02-18 05:59:05.42940494 +0000 UTC m=+682.077364074" observedRunningTime="2026-02-18 05:59:06.078097077 +0000 UTC m=+682.726056251" watchObservedRunningTime="2026-02-18 05:59:06.0815057 +0000 UTC m=+682.729464874" Feb 18 05:59:07 crc kubenswrapper[4707]: E0218 05:59:07.913216 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.17:52320->38.102.83.17:43371: write tcp 38.102.83.17:52320->38.102.83.17:43371: write: broken pipe Feb 18 05:59:08 crc kubenswrapper[4707]: E0218 05:59:08.581433 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.17:52340->38.102.83.17:43371: write tcp 38.102.83.17:52340->38.102.83.17:43371: write: broken pipe Feb 18 05:59:13 crc kubenswrapper[4707]: I0218 05:59:13.318574 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:13 crc kubenswrapper[4707]: I0218 05:59:13.318898 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:13 crc kubenswrapper[4707]: I0218 05:59:13.352786 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:14 crc kubenswrapper[4707]: I0218 05:59:14.134512 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:14 crc kubenswrapper[4707]: I0218 05:59:14.179425 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q5x7"] Feb 18 05:59:16 crc kubenswrapper[4707]: I0218 05:59:16.110885 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6q5x7" podUID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerName="registry-server" containerID="cri-o://acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24" gracePeriod=2 Feb 18 05:59:16 crc kubenswrapper[4707]: I0218 05:59:16.944600 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:16 crc kubenswrapper[4707]: I0218 05:59:16.987745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn7dn\" (UniqueName: \"kubernetes.io/projected/49418ce7-ae04-432a-8e14-4603b03dc5ef-kube-api-access-mn7dn\") pod \"49418ce7-ae04-432a-8e14-4603b03dc5ef\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " Feb 18 05:59:16 crc kubenswrapper[4707]: I0218 05:59:16.987852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-catalog-content\") pod \"49418ce7-ae04-432a-8e14-4603b03dc5ef\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " Feb 18 05:59:16 crc kubenswrapper[4707]: I0218 05:59:16.987902 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-utilities\") pod 
\"49418ce7-ae04-432a-8e14-4603b03dc5ef\" (UID: \"49418ce7-ae04-432a-8e14-4603b03dc5ef\") " Feb 18 05:59:16 crc kubenswrapper[4707]: I0218 05:59:16.989259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-utilities" (OuterVolumeSpecName: "utilities") pod "49418ce7-ae04-432a-8e14-4603b03dc5ef" (UID: "49418ce7-ae04-432a-8e14-4603b03dc5ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:16 crc kubenswrapper[4707]: I0218 05:59:16.996321 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49418ce7-ae04-432a-8e14-4603b03dc5ef-kube-api-access-mn7dn" (OuterVolumeSpecName: "kube-api-access-mn7dn") pod "49418ce7-ae04-432a-8e14-4603b03dc5ef" (UID: "49418ce7-ae04-432a-8e14-4603b03dc5ef"). InnerVolumeSpecName "kube-api-access-mn7dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.048448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49418ce7-ae04-432a-8e14-4603b03dc5ef" (UID: "49418ce7-ae04-432a-8e14-4603b03dc5ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.089504 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.089704 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn7dn\" (UniqueName: \"kubernetes.io/projected/49418ce7-ae04-432a-8e14-4603b03dc5ef-kube-api-access-mn7dn\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.090120 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49418ce7-ae04-432a-8e14-4603b03dc5ef-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.119440 4707 generic.go:334] "Generic (PLEG): container finished" podID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerID="acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24" exitCode=0 Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.119491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5x7" event={"ID":"49418ce7-ae04-432a-8e14-4603b03dc5ef","Type":"ContainerDied","Data":"acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24"} Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.119522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5x7" event={"ID":"49418ce7-ae04-432a-8e14-4603b03dc5ef","Type":"ContainerDied","Data":"a58372381135573017ff6d4f31028172681601bb6e9609aade5a973012f0fd04"} Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.119541 4707 scope.go:117] "RemoveContainer" containerID="acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 
05:59:17.119583 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q5x7" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.137380 4707 scope.go:117] "RemoveContainer" containerID="693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.155569 4707 scope.go:117] "RemoveContainer" containerID="df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.187092 4707 scope.go:117] "RemoveContainer" containerID="acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24" Feb 18 05:59:17 crc kubenswrapper[4707]: E0218 05:59:17.187821 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24\": container with ID starting with acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24 not found: ID does not exist" containerID="acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.187868 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24"} err="failed to get container status \"acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24\": rpc error: code = NotFound desc = could not find container \"acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24\": container with ID starting with acd268abeef9902d6fb981befca1246b0356372c5bde40f6ae09e58710c97f24 not found: ID does not exist" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.187900 4707 scope.go:117] "RemoveContainer" containerID="693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc" Feb 18 05:59:17 crc kubenswrapper[4707]: E0218 05:59:17.188377 4707 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc\": container with ID starting with 693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc not found: ID does not exist" containerID="693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.188399 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc"} err="failed to get container status \"693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc\": rpc error: code = NotFound desc = could not find container \"693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc\": container with ID starting with 693ea24bf061c41b7ec5ae2a8f1cdb9882a3edbfa76b0043deb6f766f77082cc not found: ID does not exist" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.188415 4707 scope.go:117] "RemoveContainer" containerID="df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047" Feb 18 05:59:17 crc kubenswrapper[4707]: E0218 05:59:17.188961 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047\": container with ID starting with df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047 not found: ID does not exist" containerID="df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.188995 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047"} err="failed to get container status \"df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047\": rpc error: code = NotFound 
desc = could not find container \"df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047\": container with ID starting with df2da4b7ec3b707accbcde57c0008aac7943a05d253c3429923e30c309e66047 not found: ID does not exist" Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.192219 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q5x7"] Feb 18 05:59:17 crc kubenswrapper[4707]: I0218 05:59:17.200822 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6q5x7"] Feb 18 05:59:18 crc kubenswrapper[4707]: I0218 05:59:18.060183 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49418ce7-ae04-432a-8e14-4603b03dc5ef" path="/var/lib/kubelet/pods/49418ce7-ae04-432a-8e14-4603b03dc5ef/volumes" Feb 18 05:59:21 crc kubenswrapper[4707]: I0218 05:59:21.382382 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:59:21 crc kubenswrapper[4707]: I0218 05:59:21.385161 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.796914 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n"] Feb 18 05:59:28 crc kubenswrapper[4707]: E0218 05:59:28.798174 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerName="registry-server" Feb 18 
05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.798194 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerName="registry-server" Feb 18 05:59:28 crc kubenswrapper[4707]: E0218 05:59:28.798207 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerName="extract-content" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.798215 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerName="extract-content" Feb 18 05:59:28 crc kubenswrapper[4707]: E0218 05:59:28.798240 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerName="extract-utilities" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.798251 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerName="extract-utilities" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.798425 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="49418ce7-ae04-432a-8e14-4603b03dc5ef" containerName="registry-server" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.799581 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.803679 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.806836 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n"] Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.843402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.843452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-482zq\" (UniqueName: \"kubernetes.io/projected/250c6074-0914-405f-ac9f-59f2d01c6cf1-kube-api-access-482zq\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.843485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:28 crc kubenswrapper[4707]: 
I0218 05:59:28.944578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.944624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-482zq\" (UniqueName: \"kubernetes.io/projected/250c6074-0914-405f-ac9f-59f2d01c6cf1-kube-api-access-482zq\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.944657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.945132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.945213 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:28 crc kubenswrapper[4707]: I0218 05:59:28.968577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-482zq\" (UniqueName: \"kubernetes.io/projected/250c6074-0914-405f-ac9f-59f2d01c6cf1-kube-api-access-482zq\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:29 crc kubenswrapper[4707]: I0218 05:59:29.123352 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:29 crc kubenswrapper[4707]: I0218 05:59:29.555340 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n"] Feb 18 05:59:30 crc kubenswrapper[4707]: I0218 05:59:30.196590 4707 generic.go:334] "Generic (PLEG): container finished" podID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerID="955e12b273264280503c0fa8404a06f963858f65d9d8d9c5096926f80c29cd0e" exitCode=0 Feb 18 05:59:30 crc kubenswrapper[4707]: I0218 05:59:30.196648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" event={"ID":"250c6074-0914-405f-ac9f-59f2d01c6cf1","Type":"ContainerDied","Data":"955e12b273264280503c0fa8404a06f963858f65d9d8d9c5096926f80c29cd0e"} Feb 18 05:59:30 crc kubenswrapper[4707]: I0218 05:59:30.196686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" event={"ID":"250c6074-0914-405f-ac9f-59f2d01c6cf1","Type":"ContainerStarted","Data":"90d077ddd6b0fba01f81c91b6e9483e2609cc69e8b316289b82edd7e5c56b4a6"} Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.037864 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ls2vx"] Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.040983 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.045975 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ls2vx"] Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.077714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzp5\" (UniqueName: \"kubernetes.io/projected/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-kube-api-access-qtzp5\") pod \"redhat-operators-ls2vx\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.077778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-catalog-content\") pod \"redhat-operators-ls2vx\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.077991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-utilities\") pod \"redhat-operators-ls2vx\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " 
pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.179229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzp5\" (UniqueName: \"kubernetes.io/projected/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-kube-api-access-qtzp5\") pod \"redhat-operators-ls2vx\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.179327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-catalog-content\") pod \"redhat-operators-ls2vx\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.179424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-utilities\") pod \"redhat-operators-ls2vx\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.179961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-catalog-content\") pod \"redhat-operators-ls2vx\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.180267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-utilities\") pod \"redhat-operators-ls2vx\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc 
kubenswrapper[4707]: I0218 05:59:31.200016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtzp5\" (UniqueName: \"kubernetes.io/projected/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-kube-api-access-qtzp5\") pod \"redhat-operators-ls2vx\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.361130 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:31 crc kubenswrapper[4707]: I0218 05:59:31.560751 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ls2vx"] Feb 18 05:59:31 crc kubenswrapper[4707]: W0218 05:59:31.600909 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9662c94b_2ccf_4ac0_b18b_3507b1856dc6.slice/crio-7a87a27bc2b8d2b460e18e7cf4719ee1894f46ea138a325851e77faec8d90162 WatchSource:0}: Error finding container 7a87a27bc2b8d2b460e18e7cf4719ee1894f46ea138a325851e77faec8d90162: Status 404 returned error can't find the container with id 7a87a27bc2b8d2b460e18e7cf4719ee1894f46ea138a325851e77faec8d90162 Feb 18 05:59:32 crc kubenswrapper[4707]: I0218 05:59:32.211260 4707 generic.go:334] "Generic (PLEG): container finished" podID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerID="aee4c8da70dfcc977fa745f3438681e48431dd6b782febb2c476c7a102490c0d" exitCode=0 Feb 18 05:59:32 crc kubenswrapper[4707]: I0218 05:59:32.211344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" event={"ID":"250c6074-0914-405f-ac9f-59f2d01c6cf1","Type":"ContainerDied","Data":"aee4c8da70dfcc977fa745f3438681e48431dd6b782febb2c476c7a102490c0d"} Feb 18 05:59:32 crc kubenswrapper[4707]: I0218 05:59:32.213300 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerID="b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe" exitCode=0 Feb 18 05:59:32 crc kubenswrapper[4707]: I0218 05:59:32.213406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ls2vx" event={"ID":"9662c94b-2ccf-4ac0-b18b-3507b1856dc6","Type":"ContainerDied","Data":"b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe"} Feb 18 05:59:32 crc kubenswrapper[4707]: I0218 05:59:32.213501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ls2vx" event={"ID":"9662c94b-2ccf-4ac0-b18b-3507b1856dc6","Type":"ContainerStarted","Data":"7a87a27bc2b8d2b460e18e7cf4719ee1894f46ea138a325851e77faec8d90162"} Feb 18 05:59:33 crc kubenswrapper[4707]: I0218 05:59:33.220159 4707 generic.go:334] "Generic (PLEG): container finished" podID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerID="a26dae667dd77f656b4a5e46da3461aa9081b37be2072c5ff0af4091fb88eaa2" exitCode=0 Feb 18 05:59:33 crc kubenswrapper[4707]: I0218 05:59:33.220245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" event={"ID":"250c6074-0914-405f-ac9f-59f2d01c6cf1","Type":"ContainerDied","Data":"a26dae667dd77f656b4a5e46da3461aa9081b37be2072c5ff0af4091fb88eaa2"} Feb 18 05:59:33 crc kubenswrapper[4707]: I0218 05:59:33.223058 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ls2vx" event={"ID":"9662c94b-2ccf-4ac0-b18b-3507b1856dc6","Type":"ContainerStarted","Data":"6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b"} Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.231297 4707 generic.go:334] "Generic (PLEG): container finished" podID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerID="6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b" exitCode=0 Feb 18 05:59:34 crc 
kubenswrapper[4707]: I0218 05:59:34.231375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ls2vx" event={"ID":"9662c94b-2ccf-4ac0-b18b-3507b1856dc6","Type":"ContainerDied","Data":"6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b"} Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.446399 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.627531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-bundle\") pod \"250c6074-0914-405f-ac9f-59f2d01c6cf1\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.627856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-util\") pod \"250c6074-0914-405f-ac9f-59f2d01c6cf1\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.627961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-482zq\" (UniqueName: \"kubernetes.io/projected/250c6074-0914-405f-ac9f-59f2d01c6cf1-kube-api-access-482zq\") pod \"250c6074-0914-405f-ac9f-59f2d01c6cf1\" (UID: \"250c6074-0914-405f-ac9f-59f2d01c6cf1\") " Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.628761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-bundle" (OuterVolumeSpecName: "bundle") pod "250c6074-0914-405f-ac9f-59f2d01c6cf1" (UID: "250c6074-0914-405f-ac9f-59f2d01c6cf1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.633879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250c6074-0914-405f-ac9f-59f2d01c6cf1-kube-api-access-482zq" (OuterVolumeSpecName: "kube-api-access-482zq") pod "250c6074-0914-405f-ac9f-59f2d01c6cf1" (UID: "250c6074-0914-405f-ac9f-59f2d01c6cf1"). InnerVolumeSpecName "kube-api-access-482zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.643203 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-util" (OuterVolumeSpecName: "util") pod "250c6074-0914-405f-ac9f-59f2d01c6cf1" (UID: "250c6074-0914-405f-ac9f-59f2d01c6cf1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.729059 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.729112 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/250c6074-0914-405f-ac9f-59f2d01c6cf1-util\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:34 crc kubenswrapper[4707]: I0218 05:59:34.729123 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-482zq\" (UniqueName: \"kubernetes.io/projected/250c6074-0914-405f-ac9f-59f2d01c6cf1-kube-api-access-482zq\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:35 crc kubenswrapper[4707]: I0218 05:59:35.238296 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ls2vx" 
event={"ID":"9662c94b-2ccf-4ac0-b18b-3507b1856dc6","Type":"ContainerStarted","Data":"114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8"} Feb 18 05:59:35 crc kubenswrapper[4707]: I0218 05:59:35.240680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" event={"ID":"250c6074-0914-405f-ac9f-59f2d01c6cf1","Type":"ContainerDied","Data":"90d077ddd6b0fba01f81c91b6e9483e2609cc69e8b316289b82edd7e5c56b4a6"} Feb 18 05:59:35 crc kubenswrapper[4707]: I0218 05:59:35.240756 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90d077ddd6b0fba01f81c91b6e9483e2609cc69e8b316289b82edd7e5c56b4a6" Feb 18 05:59:35 crc kubenswrapper[4707]: I0218 05:59:35.240904 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n" Feb 18 05:59:35 crc kubenswrapper[4707]: I0218 05:59:35.259398 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ls2vx" podStartSLOduration=1.795927193 podStartE2EDuration="4.25937897s" podCreationTimestamp="2026-02-18 05:59:31 +0000 UTC" firstStartedPulling="2026-02-18 05:59:32.214205987 +0000 UTC m=+708.862165121" lastFinishedPulling="2026-02-18 05:59:34.677657764 +0000 UTC m=+711.325616898" observedRunningTime="2026-02-18 05:59:35.257072298 +0000 UTC m=+711.905031472" watchObservedRunningTime="2026-02-18 05:59:35.25937897 +0000 UTC m=+711.907338104" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.138065 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-wjprs"] Feb 18 05:59:37 crc kubenswrapper[4707]: E0218 05:59:37.138643 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerName="util" Feb 18 05:59:37 crc 
kubenswrapper[4707]: I0218 05:59:37.138660 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerName="util" Feb 18 05:59:37 crc kubenswrapper[4707]: E0218 05:59:37.138680 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerName="pull" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.138687 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerName="pull" Feb 18 05:59:37 crc kubenswrapper[4707]: E0218 05:59:37.138706 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerName="extract" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.138714 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerName="extract" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.138892 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="250c6074-0914-405f-ac9f-59f2d01c6cf1" containerName="extract" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.139322 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-wjprs" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.143249 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.143301 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mg4dm" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.143365 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.147395 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-wjprs"] Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.157601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhfzs\" (UniqueName: \"kubernetes.io/projected/69e29cb2-d836-4c53-81e6-1d387d6202b9-kube-api-access-mhfzs\") pod \"nmstate-operator-694c9596b7-wjprs\" (UID: \"69e29cb2-d836-4c53-81e6-1d387d6202b9\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-wjprs" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.258435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhfzs\" (UniqueName: \"kubernetes.io/projected/69e29cb2-d836-4c53-81e6-1d387d6202b9-kube-api-access-mhfzs\") pod \"nmstate-operator-694c9596b7-wjprs\" (UID: \"69e29cb2-d836-4c53-81e6-1d387d6202b9\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-wjprs" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.274469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhfzs\" (UniqueName: \"kubernetes.io/projected/69e29cb2-d836-4c53-81e6-1d387d6202b9-kube-api-access-mhfzs\") pod \"nmstate-operator-694c9596b7-wjprs\" (UID: 
\"69e29cb2-d836-4c53-81e6-1d387d6202b9\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-wjprs" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.455405 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-wjprs" Feb 18 05:59:37 crc kubenswrapper[4707]: I0218 05:59:37.639277 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-wjprs"] Feb 18 05:59:37 crc kubenswrapper[4707]: W0218 05:59:37.646744 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e29cb2_d836_4c53_81e6_1d387d6202b9.slice/crio-0d82f32144b26ccbe2bf84e75f314e784c745a7876094cdd42b2358ab997a742 WatchSource:0}: Error finding container 0d82f32144b26ccbe2bf84e75f314e784c745a7876094cdd42b2358ab997a742: Status 404 returned error can't find the container with id 0d82f32144b26ccbe2bf84e75f314e784c745a7876094cdd42b2358ab997a742 Feb 18 05:59:38 crc kubenswrapper[4707]: I0218 05:59:38.257197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-wjprs" event={"ID":"69e29cb2-d836-4c53-81e6-1d387d6202b9","Type":"ContainerStarted","Data":"0d82f32144b26ccbe2bf84e75f314e784c745a7876094cdd42b2358ab997a742"} Feb 18 05:59:41 crc kubenswrapper[4707]: I0218 05:59:41.362206 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:41 crc kubenswrapper[4707]: I0218 05:59:41.362754 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:41 crc kubenswrapper[4707]: I0218 05:59:41.409780 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:42 crc kubenswrapper[4707]: I0218 05:59:42.308377 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:43 crc kubenswrapper[4707]: I0218 05:59:43.280122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-wjprs" event={"ID":"69e29cb2-d836-4c53-81e6-1d387d6202b9","Type":"ContainerStarted","Data":"e2b871a651bf0c8dfcd51ed4ea61198949533d5263d925680a6d1fa0bce6ae14"} Feb 18 05:59:43 crc kubenswrapper[4707]: I0218 05:59:43.299337 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-wjprs" podStartSLOduration=1.8436456959999998 podStartE2EDuration="6.299320204s" podCreationTimestamp="2026-02-18 05:59:37 +0000 UTC" firstStartedPulling="2026-02-18 05:59:37.650255384 +0000 UTC m=+714.298214518" lastFinishedPulling="2026-02-18 05:59:42.105929882 +0000 UTC m=+718.753889026" observedRunningTime="2026-02-18 05:59:43.29841624 +0000 UTC m=+719.946375374" watchObservedRunningTime="2026-02-18 05:59:43.299320204 +0000 UTC m=+719.947279338" Feb 18 05:59:43 crc kubenswrapper[4707]: I0218 05:59:43.629267 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ls2vx"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.127100 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.128178 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.134158 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.134944 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.135316 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-d52b2" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.137021 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.137626 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.161724 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.176576 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-dfp8q"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.177364 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.238108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngj4f\" (UniqueName: \"kubernetes.io/projected/07ea2b2b-697c-491e-89fe-707d7a2f6a32-kube-api-access-ngj4f\") pod \"nmstate-metrics-58c85c668d-qmfqj\" (UID: \"07ea2b2b-697c-491e-89fe-707d7a2f6a32\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.238397 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ff6bca46-e4fa-443f-82c7-7995a2b6499b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-cn2n8\" (UID: \"ff6bca46-e4fa-443f-82c7-7995a2b6499b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.238495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9ppr\" (UniqueName: \"kubernetes.io/projected/ff6bca46-e4fa-443f-82c7-7995a2b6499b-kube-api-access-n9ppr\") pod \"nmstate-webhook-866bcb46dc-cn2n8\" (UID: \"ff6bca46-e4fa-443f-82c7-7995a2b6499b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.286605 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ls2vx" podUID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerName="registry-server" containerID="cri-o://114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8" gracePeriod=2 Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.309052 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.309729 4707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.311331 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.311391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.312520 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-r46wx" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.323750 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.339707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ff6bca46-e4fa-443f-82c7-7995a2b6499b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-cn2n8\" (UID: \"ff6bca46-e4fa-443f-82c7-7995a2b6499b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.339764 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7c0f897b-dd2e-4356-ae9d-a85bae401266-nmstate-lock\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.339811 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9ppr\" (UniqueName: \"kubernetes.io/projected/ff6bca46-e4fa-443f-82c7-7995a2b6499b-kube-api-access-n9ppr\") pod \"nmstate-webhook-866bcb46dc-cn2n8\" (UID: \"ff6bca46-e4fa-443f-82c7-7995a2b6499b\") " 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.339848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7c0f897b-dd2e-4356-ae9d-a85bae401266-dbus-socket\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.339872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngj4f\" (UniqueName: \"kubernetes.io/projected/07ea2b2b-697c-491e-89fe-707d7a2f6a32-kube-api-access-ngj4f\") pod \"nmstate-metrics-58c85c668d-qmfqj\" (UID: \"07ea2b2b-697c-491e-89fe-707d7a2f6a32\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.339900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx567\" (UniqueName: \"kubernetes.io/projected/7c0f897b-dd2e-4356-ae9d-a85bae401266-kube-api-access-xx567\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.339915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7c0f897b-dd2e-4356-ae9d-a85bae401266-ovs-socket\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.346974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ff6bca46-e4fa-443f-82c7-7995a2b6499b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-cn2n8\" (UID: 
\"ff6bca46-e4fa-443f-82c7-7995a2b6499b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.365949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngj4f\" (UniqueName: \"kubernetes.io/projected/07ea2b2b-697c-491e-89fe-707d7a2f6a32-kube-api-access-ngj4f\") pod \"nmstate-metrics-58c85c668d-qmfqj\" (UID: \"07ea2b2b-697c-491e-89fe-707d7a2f6a32\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.366887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9ppr\" (UniqueName: \"kubernetes.io/projected/ff6bca46-e4fa-443f-82c7-7995a2b6499b-kube-api-access-n9ppr\") pod \"nmstate-webhook-866bcb46dc-cn2n8\" (UID: \"ff6bca46-e4fa-443f-82c7-7995a2b6499b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.443076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fd51f090-835b-4d6f-9204-1564b2430039-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-8xc9q\" (UID: \"fd51f090-835b-4d6f-9204-1564b2430039\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.443152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7c0f897b-dd2e-4356-ae9d-a85bae401266-dbus-socket\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.443206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx567\" (UniqueName: 
\"kubernetes.io/projected/7c0f897b-dd2e-4356-ae9d-a85bae401266-kube-api-access-xx567\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.443232 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7c0f897b-dd2e-4356-ae9d-a85bae401266-ovs-socket\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.443276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7c0f897b-dd2e-4356-ae9d-a85bae401266-nmstate-lock\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.443299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bbpf\" (UniqueName: \"kubernetes.io/projected/fd51f090-835b-4d6f-9204-1564b2430039-kube-api-access-7bbpf\") pod \"nmstate-console-plugin-5c78fc5d65-8xc9q\" (UID: \"fd51f090-835b-4d6f-9204-1564b2430039\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.443325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd51f090-835b-4d6f-9204-1564b2430039-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-8xc9q\" (UID: \"fd51f090-835b-4d6f-9204-1564b2430039\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.443458 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7c0f897b-dd2e-4356-ae9d-a85bae401266-ovs-socket\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.443502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7c0f897b-dd2e-4356-ae9d-a85bae401266-nmstate-lock\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.444056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7c0f897b-dd2e-4356-ae9d-a85bae401266-dbus-socket\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.446905 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.457454 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.476928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx567\" (UniqueName: \"kubernetes.io/projected/7c0f897b-dd2e-4356-ae9d-a85bae401266-kube-api-access-xx567\") pod \"nmstate-handler-dfp8q\" (UID: \"7c0f897b-dd2e-4356-ae9d-a85bae401266\") " pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.502959 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.520159 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-79b486fcf8-xtl2m"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.520884 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.533966 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79b486fcf8-xtl2m"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.544826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbpf\" (UniqueName: \"kubernetes.io/projected/fd51f090-835b-4d6f-9204-1564b2430039-kube-api-access-7bbpf\") pod \"nmstate-console-plugin-5c78fc5d65-8xc9q\" (UID: \"fd51f090-835b-4d6f-9204-1564b2430039\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.544892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd51f090-835b-4d6f-9204-1564b2430039-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-8xc9q\" (UID: \"fd51f090-835b-4d6f-9204-1564b2430039\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.544946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fd51f090-835b-4d6f-9204-1564b2430039-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-8xc9q\" (UID: \"fd51f090-835b-4d6f-9204-1564b2430039\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.545965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fd51f090-835b-4d6f-9204-1564b2430039-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-8xc9q\" (UID: \"fd51f090-835b-4d6f-9204-1564b2430039\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.555935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd51f090-835b-4d6f-9204-1564b2430039-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-8xc9q\" (UID: \"fd51f090-835b-4d6f-9204-1564b2430039\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.568511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbpf\" (UniqueName: \"kubernetes.io/projected/fd51f090-835b-4d6f-9204-1564b2430039-kube-api-access-7bbpf\") pod \"nmstate-console-plugin-5c78fc5d65-8xc9q\" (UID: \"fd51f090-835b-4d6f-9204-1564b2430039\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.625649 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.647576 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6988c63a-a3aa-418e-8a95-865d274cb86b-console-oauth-config\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.647650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-service-ca\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.647706 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6988c63a-a3aa-418e-8a95-865d274cb86b-console-serving-cert\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.647780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btt5k\" (UniqueName: \"kubernetes.io/projected/6988c63a-a3aa-418e-8a95-865d274cb86b-kube-api-access-btt5k\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.647856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-oauth-serving-cert\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.647917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-console-config\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.647955 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-trusted-ca-bundle\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.748680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btt5k\" (UniqueName: \"kubernetes.io/projected/6988c63a-a3aa-418e-8a95-865d274cb86b-kube-api-access-btt5k\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.748729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-oauth-serving-cert\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.748765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-console-config\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.748788 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-trusted-ca-bundle\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.748847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6988c63a-a3aa-418e-8a95-865d274cb86b-console-oauth-config\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.748865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-service-ca\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.748887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6988c63a-a3aa-418e-8a95-865d274cb86b-console-serving-cert\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.750539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-service-ca\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.750984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-oauth-serving-cert\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.751086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-console-config\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.752551 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6988c63a-a3aa-418e-8a95-865d274cb86b-console-oauth-config\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.755642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6988c63a-a3aa-418e-8a95-865d274cb86b-console-serving-cert\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.757921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6988c63a-a3aa-418e-8a95-865d274cb86b-trusted-ca-bundle\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.766294 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.768816 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btt5k\" (UniqueName: \"kubernetes.io/projected/6988c63a-a3aa-418e-8a95-865d274cb86b-kube-api-access-btt5k\") pod \"console-79b486fcf8-xtl2m\" (UID: \"6988c63a-a3aa-418e-8a95-865d274cb86b\") " pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: W0218 05:59:44.773183 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff6bca46_e4fa_443f_82c7_7995a2b6499b.slice/crio-0bc8e94c74a986c6f1e84b6ee1041450999a2404348c98ca58ac315156d9292b WatchSource:0}: Error finding container 0bc8e94c74a986c6f1e84b6ee1041450999a2404348c98ca58ac315156d9292b: Status 404 returned error can't find the container with id 0bc8e94c74a986c6f1e84b6ee1041450999a2404348c98ca58ac315156d9292b Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.837257 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.841518 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.897242 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj"] Feb 18 05:59:44 crc kubenswrapper[4707]: I0218 05:59:44.997218 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79b486fcf8-xtl2m"] Feb 18 05:59:45 crc kubenswrapper[4707]: W0218 05:59:45.003216 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6988c63a_a3aa_418e_8a95_865d274cb86b.slice/crio-114fbdcf80e7f45bcca812ea5003e3feb1ff4ceb1077ad831746b787e6a39cfc WatchSource:0}: Error finding container 114fbdcf80e7f45bcca812ea5003e3feb1ff4ceb1077ad831746b787e6a39cfc: Status 404 returned error can't find the container with id 114fbdcf80e7f45bcca812ea5003e3feb1ff4ceb1077ad831746b787e6a39cfc Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.292269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj" event={"ID":"07ea2b2b-697c-491e-89fe-707d7a2f6a32","Type":"ContainerStarted","Data":"663e02de4a8ffe7abc73ae8113f6703f93efb03132e305f7d0b0208150d1b283"} Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.293320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dfp8q" event={"ID":"7c0f897b-dd2e-4356-ae9d-a85bae401266","Type":"ContainerStarted","Data":"9b249520c5623441ebafc4e478cb9d859ec771ded5d9ba199afe1f7d16589afe"} Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.294162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b486fcf8-xtl2m" event={"ID":"6988c63a-a3aa-418e-8a95-865d274cb86b","Type":"ContainerStarted","Data":"114fbdcf80e7f45bcca812ea5003e3feb1ff4ceb1077ad831746b787e6a39cfc"} Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.294881 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" event={"ID":"fd51f090-835b-4d6f-9204-1564b2430039","Type":"ContainerStarted","Data":"aa123b01df60df45431f4593f10586cebfdbd92506cd5db480403f75d1e4c4aa"} Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.295688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" event={"ID":"ff6bca46-e4fa-443f-82c7-7995a2b6499b","Type":"ContainerStarted","Data":"0bc8e94c74a986c6f1e84b6ee1041450999a2404348c98ca58ac315156d9292b"} Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.820570 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.965311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-utilities\") pod \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.965414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-catalog-content\") pod \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.965458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtzp5\" (UniqueName: \"kubernetes.io/projected/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-kube-api-access-qtzp5\") pod \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\" (UID: \"9662c94b-2ccf-4ac0-b18b-3507b1856dc6\") " Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.969552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-utilities" (OuterVolumeSpecName: "utilities") pod "9662c94b-2ccf-4ac0-b18b-3507b1856dc6" (UID: "9662c94b-2ccf-4ac0-b18b-3507b1856dc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:45 crc kubenswrapper[4707]: I0218 05:59:45.986596 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-kube-api-access-qtzp5" (OuterVolumeSpecName: "kube-api-access-qtzp5") pod "9662c94b-2ccf-4ac0-b18b-3507b1856dc6" (UID: "9662c94b-2ccf-4ac0-b18b-3507b1856dc6"). InnerVolumeSpecName "kube-api-access-qtzp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.066921 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtzp5\" (UniqueName: \"kubernetes.io/projected/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-kube-api-access-qtzp5\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.066944 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.093092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9662c94b-2ccf-4ac0-b18b-3507b1856dc6" (UID: "9662c94b-2ccf-4ac0-b18b-3507b1856dc6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.168496 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9662c94b-2ccf-4ac0-b18b-3507b1856dc6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.308185 4707 generic.go:334] "Generic (PLEG): container finished" podID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerID="114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8" exitCode=0 Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.308258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ls2vx" event={"ID":"9662c94b-2ccf-4ac0-b18b-3507b1856dc6","Type":"ContainerDied","Data":"114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8"} Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.308284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ls2vx" event={"ID":"9662c94b-2ccf-4ac0-b18b-3507b1856dc6","Type":"ContainerDied","Data":"7a87a27bc2b8d2b460e18e7cf4719ee1894f46ea138a325851e77faec8d90162"} Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.308317 4707 scope.go:117] "RemoveContainer" containerID="114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8" Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.308444 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ls2vx" Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.312444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b486fcf8-xtl2m" event={"ID":"6988c63a-a3aa-418e-8a95-865d274cb86b","Type":"ContainerStarted","Data":"c34da9f90d26061a9bc1b69353c5f892d199063f38c0d8a141e941421e63c8d1"} Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.334148 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79b486fcf8-xtl2m" podStartSLOduration=2.334128885 podStartE2EDuration="2.334128885s" podCreationTimestamp="2026-02-18 05:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 05:59:46.330471977 +0000 UTC m=+722.978431121" watchObservedRunningTime="2026-02-18 05:59:46.334128885 +0000 UTC m=+722.982088019" Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.351293 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ls2vx"] Feb 18 05:59:46 crc kubenswrapper[4707]: I0218 05:59:46.357760 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ls2vx"] Feb 18 05:59:47 crc kubenswrapper[4707]: I0218 05:59:47.246962 4707 scope.go:117] "RemoveContainer" containerID="6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b" Feb 18 05:59:47 crc kubenswrapper[4707]: I0218 05:59:47.279129 4707 scope.go:117] "RemoveContainer" containerID="b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe" Feb 18 05:59:47 crc kubenswrapper[4707]: I0218 05:59:47.295184 4707 scope.go:117] "RemoveContainer" containerID="114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8" Feb 18 05:59:47 crc kubenswrapper[4707]: E0218 05:59:47.295664 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8\": container with ID starting with 114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8 not found: ID does not exist" containerID="114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8" Feb 18 05:59:47 crc kubenswrapper[4707]: I0218 05:59:47.295698 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8"} err="failed to get container status \"114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8\": rpc error: code = NotFound desc = could not find container \"114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8\": container with ID starting with 114520895e8a8a5a97b81717cbac1d24e38a2d339bd68c9b82cc89f1ca5714e8 not found: ID does not exist" Feb 18 05:59:47 crc kubenswrapper[4707]: I0218 05:59:47.295719 4707 scope.go:117] "RemoveContainer" containerID="6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b" Feb 18 05:59:47 crc kubenswrapper[4707]: E0218 05:59:47.296146 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b\": container with ID starting with 6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b not found: ID does not exist" containerID="6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b" Feb 18 05:59:47 crc kubenswrapper[4707]: I0218 05:59:47.296196 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b"} err="failed to get container status \"6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b\": rpc error: code = NotFound desc = could not find container 
\"6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b\": container with ID starting with 6b543fea061abf7fa87af1a06ebe441bb1f9b5ba7cb619eeb35c91ffe6c43e3b not found: ID does not exist" Feb 18 05:59:47 crc kubenswrapper[4707]: I0218 05:59:47.296228 4707 scope.go:117] "RemoveContainer" containerID="b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe" Feb 18 05:59:47 crc kubenswrapper[4707]: E0218 05:59:47.296780 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe\": container with ID starting with b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe not found: ID does not exist" containerID="b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe" Feb 18 05:59:47 crc kubenswrapper[4707]: I0218 05:59:47.296882 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe"} err="failed to get container status \"b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe\": rpc error: code = NotFound desc = could not find container \"b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe\": container with ID starting with b69339a4554ecbc7a0df2e75193b77de0c44fec68973e786ca38605d89f533fe not found: ID does not exist" Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.062829 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" path="/var/lib/kubelet/pods/9662c94b-2ccf-4ac0-b18b-3507b1856dc6/volumes" Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.330711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" 
event={"ID":"fd51f090-835b-4d6f-9204-1564b2430039","Type":"ContainerStarted","Data":"73a14ec536513cc9d88e69b71230d00d81f629a74febc4f4cfaa3f0b25d2fe7c"} Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.332571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" event={"ID":"ff6bca46-e4fa-443f-82c7-7995a2b6499b","Type":"ContainerStarted","Data":"8ba400838ba9a897b4c7b1d6de22aeaf184dc3ea6320d6d5871fe036a1264528"} Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.332680 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.334919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj" event={"ID":"07ea2b2b-697c-491e-89fe-707d7a2f6a32","Type":"ContainerStarted","Data":"856159bc1044c85c19c92b2ce822ee6ab261a345159e11d046bb9b7f5bd6a5df"} Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.336762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dfp8q" event={"ID":"7c0f897b-dd2e-4356-ae9d-a85bae401266","Type":"ContainerStarted","Data":"41b71572cbf2031241c682389468424b14c15dffc9320cb71ad775cd2e5adf12"} Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.336958 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.355984 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8xc9q" podStartSLOduration=1.8922668740000002 podStartE2EDuration="4.355955805s" podCreationTimestamp="2026-02-18 05:59:44 +0000 UTC" firstStartedPulling="2026-02-18 05:59:44.842535707 +0000 UTC m=+721.490494841" lastFinishedPulling="2026-02-18 05:59:47.306224638 +0000 UTC m=+723.954183772" 
observedRunningTime="2026-02-18 05:59:48.346754917 +0000 UTC m=+724.994714101" watchObservedRunningTime="2026-02-18 05:59:48.355955805 +0000 UTC m=+725.003914949" Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.366597 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-dfp8q" podStartSLOduration=1.624489737 podStartE2EDuration="4.36658128s" podCreationTimestamp="2026-02-18 05:59:44 +0000 UTC" firstStartedPulling="2026-02-18 05:59:44.567495365 +0000 UTC m=+721.215454499" lastFinishedPulling="2026-02-18 05:59:47.309586908 +0000 UTC m=+723.957546042" observedRunningTime="2026-02-18 05:59:48.362330085 +0000 UTC m=+725.010289219" watchObservedRunningTime="2026-02-18 05:59:48.36658128 +0000 UTC m=+725.014540414" Feb 18 05:59:48 crc kubenswrapper[4707]: I0218 05:59:48.385440 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" podStartSLOduration=1.851549943 podStartE2EDuration="4.385417096s" podCreationTimestamp="2026-02-18 05:59:44 +0000 UTC" firstStartedPulling="2026-02-18 05:59:44.774478321 +0000 UTC m=+721.422437465" lastFinishedPulling="2026-02-18 05:59:47.308345464 +0000 UTC m=+723.956304618" observedRunningTime="2026-02-18 05:59:48.378624953 +0000 UTC m=+725.026584087" watchObservedRunningTime="2026-02-18 05:59:48.385417096 +0000 UTC m=+725.033376220" Feb 18 05:59:50 crc kubenswrapper[4707]: I0218 05:59:50.355727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj" event={"ID":"07ea2b2b-697c-491e-89fe-707d7a2f6a32","Type":"ContainerStarted","Data":"f91473506d3c15c69925cc71d1e1c8546ad502ffc17704f1831621603dcd9ef1"} Feb 18 05:59:50 crc kubenswrapper[4707]: I0218 05:59:50.379356 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qmfqj" podStartSLOduration=1.921831128 
podStartE2EDuration="6.379321256s" podCreationTimestamp="2026-02-18 05:59:44 +0000 UTC" firstStartedPulling="2026-02-18 05:59:44.905710823 +0000 UTC m=+721.553669957" lastFinishedPulling="2026-02-18 05:59:49.363200951 +0000 UTC m=+726.011160085" observedRunningTime="2026-02-18 05:59:50.378885343 +0000 UTC m=+727.026844477" watchObservedRunningTime="2026-02-18 05:59:50.379321256 +0000 UTC m=+727.027280410" Feb 18 05:59:51 crc kubenswrapper[4707]: I0218 05:59:51.382751 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 05:59:51 crc kubenswrapper[4707]: I0218 05:59:51.382925 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 05:59:54 crc kubenswrapper[4707]: I0218 05:59:54.526092 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-dfp8q" Feb 18 05:59:54 crc kubenswrapper[4707]: I0218 05:59:54.842306 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:54 crc kubenswrapper[4707]: I0218 05:59:54.842360 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:54 crc kubenswrapper[4707]: I0218 05:59:54.849089 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:55 crc kubenswrapper[4707]: I0218 05:59:55.391717 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-79b486fcf8-xtl2m" Feb 18 05:59:55 crc kubenswrapper[4707]: I0218 05:59:55.452233 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-m2l7m"] Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.152743 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98"] Feb 18 06:00:00 crc kubenswrapper[4707]: E0218 06:00:00.153481 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerName="extract-utilities" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.153498 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerName="extract-utilities" Feb 18 06:00:00 crc kubenswrapper[4707]: E0218 06:00:00.153511 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerName="extract-content" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.153518 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerName="extract-content" Feb 18 06:00:00 crc kubenswrapper[4707]: E0218 06:00:00.153533 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerName="registry-server" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.153539 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerName="registry-server" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.153643 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9662c94b-2ccf-4ac0-b18b-3507b1856dc6" containerName="registry-server" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.154073 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.157186 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.159821 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.161106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98"] Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.250342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxqln\" (UniqueName: \"kubernetes.io/projected/689c82fa-0c0f-44ad-bb51-cf3769313da5-kube-api-access-wxqln\") pod \"collect-profiles-29523240-clj98\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.250396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/689c82fa-0c0f-44ad-bb51-cf3769313da5-config-volume\") pod \"collect-profiles-29523240-clj98\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.250439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/689c82fa-0c0f-44ad-bb51-cf3769313da5-secret-volume\") pod \"collect-profiles-29523240-clj98\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.351816 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxqln\" (UniqueName: \"kubernetes.io/projected/689c82fa-0c0f-44ad-bb51-cf3769313da5-kube-api-access-wxqln\") pod \"collect-profiles-29523240-clj98\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.352337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/689c82fa-0c0f-44ad-bb51-cf3769313da5-config-volume\") pod \"collect-profiles-29523240-clj98\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.352405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/689c82fa-0c0f-44ad-bb51-cf3769313da5-secret-volume\") pod \"collect-profiles-29523240-clj98\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.353688 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/689c82fa-0c0f-44ad-bb51-cf3769313da5-config-volume\") pod \"collect-profiles-29523240-clj98\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.361319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/689c82fa-0c0f-44ad-bb51-cf3769313da5-secret-volume\") pod \"collect-profiles-29523240-clj98\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.368988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxqln\" (UniqueName: \"kubernetes.io/projected/689c82fa-0c0f-44ad-bb51-cf3769313da5-kube-api-access-wxqln\") pod \"collect-profiles-29523240-clj98\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.509082 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:00 crc kubenswrapper[4707]: I0218 06:00:00.735767 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98"] Feb 18 06:00:00 crc kubenswrapper[4707]: W0218 06:00:00.741949 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod689c82fa_0c0f_44ad_bb51_cf3769313da5.slice/crio-48cc2e58c56db2efc20454396d3e40b68fc7d4566951734e20863cb7ca43f1c2 WatchSource:0}: Error finding container 48cc2e58c56db2efc20454396d3e40b68fc7d4566951734e20863cb7ca43f1c2: Status 404 returned error can't find the container with id 48cc2e58c56db2efc20454396d3e40b68fc7d4566951734e20863cb7ca43f1c2 Feb 18 06:00:01 crc kubenswrapper[4707]: I0218 06:00:01.430416 4707 generic.go:334] "Generic (PLEG): container finished" podID="689c82fa-0c0f-44ad-bb51-cf3769313da5" containerID="a71b2691e9a4be1cc9f471fa6a8055333e2bdd69f90cb6711ecb5def43d1abb4" exitCode=0 Feb 18 06:00:01 crc kubenswrapper[4707]: I0218 06:00:01.430495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" event={"ID":"689c82fa-0c0f-44ad-bb51-cf3769313da5","Type":"ContainerDied","Data":"a71b2691e9a4be1cc9f471fa6a8055333e2bdd69f90cb6711ecb5def43d1abb4"} Feb 18 06:00:01 crc kubenswrapper[4707]: I0218 06:00:01.430560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" event={"ID":"689c82fa-0c0f-44ad-bb51-cf3769313da5","Type":"ContainerStarted","Data":"48cc2e58c56db2efc20454396d3e40b68fc7d4566951734e20863cb7ca43f1c2"} Feb 18 06:00:02 crc kubenswrapper[4707]: I0218 06:00:02.739045 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:02 crc kubenswrapper[4707]: I0218 06:00:02.912417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/689c82fa-0c0f-44ad-bb51-cf3769313da5-config-volume\") pod \"689c82fa-0c0f-44ad-bb51-cf3769313da5\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " Feb 18 06:00:02 crc kubenswrapper[4707]: I0218 06:00:02.912633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/689c82fa-0c0f-44ad-bb51-cf3769313da5-secret-volume\") pod \"689c82fa-0c0f-44ad-bb51-cf3769313da5\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " Feb 18 06:00:02 crc kubenswrapper[4707]: I0218 06:00:02.912747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxqln\" (UniqueName: \"kubernetes.io/projected/689c82fa-0c0f-44ad-bb51-cf3769313da5-kube-api-access-wxqln\") pod \"689c82fa-0c0f-44ad-bb51-cf3769313da5\" (UID: \"689c82fa-0c0f-44ad-bb51-cf3769313da5\") " Feb 18 06:00:02 crc kubenswrapper[4707]: I0218 06:00:02.913579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/689c82fa-0c0f-44ad-bb51-cf3769313da5-config-volume" (OuterVolumeSpecName: "config-volume") pod "689c82fa-0c0f-44ad-bb51-cf3769313da5" (UID: "689c82fa-0c0f-44ad-bb51-cf3769313da5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:00:02 crc kubenswrapper[4707]: I0218 06:00:02.918412 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689c82fa-0c0f-44ad-bb51-cf3769313da5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "689c82fa-0c0f-44ad-bb51-cf3769313da5" (UID: "689c82fa-0c0f-44ad-bb51-cf3769313da5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:00:02 crc kubenswrapper[4707]: I0218 06:00:02.919122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689c82fa-0c0f-44ad-bb51-cf3769313da5-kube-api-access-wxqln" (OuterVolumeSpecName: "kube-api-access-wxqln") pod "689c82fa-0c0f-44ad-bb51-cf3769313da5" (UID: "689c82fa-0c0f-44ad-bb51-cf3769313da5"). InnerVolumeSpecName "kube-api-access-wxqln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:00:03 crc kubenswrapper[4707]: I0218 06:00:03.014450 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxqln\" (UniqueName: \"kubernetes.io/projected/689c82fa-0c0f-44ad-bb51-cf3769313da5-kube-api-access-wxqln\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:03 crc kubenswrapper[4707]: I0218 06:00:03.014493 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/689c82fa-0c0f-44ad-bb51-cf3769313da5-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:03 crc kubenswrapper[4707]: I0218 06:00:03.014503 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/689c82fa-0c0f-44ad-bb51-cf3769313da5-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:03 crc kubenswrapper[4707]: I0218 06:00:03.441625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" event={"ID":"689c82fa-0c0f-44ad-bb51-cf3769313da5","Type":"ContainerDied","Data":"48cc2e58c56db2efc20454396d3e40b68fc7d4566951734e20863cb7ca43f1c2"} Feb 18 06:00:03 crc kubenswrapper[4707]: I0218 06:00:03.441660 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48cc2e58c56db2efc20454396d3e40b68fc7d4566951734e20863cb7ca43f1c2" Feb 18 06:00:03 crc kubenswrapper[4707]: I0218 06:00:03.441677 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98" Feb 18 06:00:04 crc kubenswrapper[4707]: I0218 06:00:04.463540 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-cn2n8" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.487175 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85"] Feb 18 06:00:15 crc kubenswrapper[4707]: E0218 06:00:15.488049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689c82fa-0c0f-44ad-bb51-cf3769313da5" containerName="collect-profiles" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.488066 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="689c82fa-0c0f-44ad-bb51-cf3769313da5" containerName="collect-profiles" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.488200 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="689c82fa-0c0f-44ad-bb51-cf3769313da5" containerName="collect-profiles" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.489389 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.491373 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.493144 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85"] Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.686402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.686485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxs4h\" (UniqueName: \"kubernetes.io/projected/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-kube-api-access-sxs4h\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.686513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:15 crc kubenswrapper[4707]: 
I0218 06:00:15.787627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.787714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxs4h\" (UniqueName: \"kubernetes.io/projected/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-kube-api-access-sxs4h\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.787751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.788193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.788266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:15 crc kubenswrapper[4707]: I0218 06:00:15.818588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxs4h\" (UniqueName: \"kubernetes.io/projected/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-kube-api-access-sxs4h\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:16 crc kubenswrapper[4707]: I0218 06:00:16.116459 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:16 crc kubenswrapper[4707]: I0218 06:00:16.308850 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85"] Feb 18 06:00:16 crc kubenswrapper[4707]: I0218 06:00:16.513948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" event={"ID":"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2","Type":"ContainerStarted","Data":"0179add9b12407905fd7f2bcb6bb96dbd6f04f56b30cc5855499ab0b6d34b66f"} Feb 18 06:00:16 crc kubenswrapper[4707]: I0218 06:00:16.514312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" event={"ID":"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2","Type":"ContainerStarted","Data":"e87efa3ed4750315227caae02bc449bf700d2dd3f8a47f3304c8beabecf4f528"} Feb 18 06:00:17 crc kubenswrapper[4707]: I0218 06:00:17.522531 4707 
generic.go:334] "Generic (PLEG): container finished" podID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" containerID="0179add9b12407905fd7f2bcb6bb96dbd6f04f56b30cc5855499ab0b6d34b66f" exitCode=0 Feb 18 06:00:17 crc kubenswrapper[4707]: I0218 06:00:17.522575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" event={"ID":"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2","Type":"ContainerDied","Data":"0179add9b12407905fd7f2bcb6bb96dbd6f04f56b30cc5855499ab0b6d34b66f"} Feb 18 06:00:19 crc kubenswrapper[4707]: I0218 06:00:19.536152 4707 generic.go:334] "Generic (PLEG): container finished" podID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" containerID="10f98beb1946ecfe91f2dc576606187c2d589beb7390c4729d6d649a19832cf5" exitCode=0 Feb 18 06:00:19 crc kubenswrapper[4707]: I0218 06:00:19.536288 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" event={"ID":"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2","Type":"ContainerDied","Data":"10f98beb1946ecfe91f2dc576606187c2d589beb7390c4729d6d649a19832cf5"} Feb 18 06:00:20 crc kubenswrapper[4707]: I0218 06:00:20.497080 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-m2l7m" podUID="a8657192-49a2-4c45-bc94-bbc3e2e608af" containerName="console" containerID="cri-o://1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6" gracePeriod=15 Feb 18 06:00:20 crc kubenswrapper[4707]: I0218 06:00:20.549172 4707 generic.go:334] "Generic (PLEG): container finished" podID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" containerID="467640c565584074219c32a2189dab2a8535dad9525fc6b5d1a2ad7aac862176" exitCode=0 Feb 18 06:00:20 crc kubenswrapper[4707]: I0218 06:00:20.549231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" 
event={"ID":"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2","Type":"ContainerDied","Data":"467640c565584074219c32a2189dab2a8535dad9525fc6b5d1a2ad7aac862176"} Feb 18 06:00:20 crc kubenswrapper[4707]: I0218 06:00:20.884455 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-m2l7m_a8657192-49a2-4c45-bc94-bbc3e2e608af/console/0.log" Feb 18 06:00:20 crc kubenswrapper[4707]: I0218 06:00:20.884872 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.053571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-trusted-ca-bundle\") pod \"a8657192-49a2-4c45-bc94-bbc3e2e608af\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.053628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-serving-cert\") pod \"a8657192-49a2-4c45-bc94-bbc3e2e608af\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.053658 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbfcc\" (UniqueName: \"kubernetes.io/projected/a8657192-49a2-4c45-bc94-bbc3e2e608af-kube-api-access-gbfcc\") pod \"a8657192-49a2-4c45-bc94-bbc3e2e608af\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.053727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-service-ca\") pod \"a8657192-49a2-4c45-bc94-bbc3e2e608af\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " 
Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.054561 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a8657192-49a2-4c45-bc94-bbc3e2e608af" (UID: "a8657192-49a2-4c45-bc94-bbc3e2e608af"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.054601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-service-ca" (OuterVolumeSpecName: "service-ca") pod "a8657192-49a2-4c45-bc94-bbc3e2e608af" (UID: "a8657192-49a2-4c45-bc94-bbc3e2e608af"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.054877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-oauth-config\") pod \"a8657192-49a2-4c45-bc94-bbc3e2e608af\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.054986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-config\") pod \"a8657192-49a2-4c45-bc94-bbc3e2e608af\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.055051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-oauth-serving-cert\") pod \"a8657192-49a2-4c45-bc94-bbc3e2e608af\" (UID: \"a8657192-49a2-4c45-bc94-bbc3e2e608af\") " Feb 18 06:00:21 crc 
kubenswrapper[4707]: I0218 06:00:21.055389 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.055405 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.055438 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-config" (OuterVolumeSpecName: "console-config") pod "a8657192-49a2-4c45-bc94-bbc3e2e608af" (UID: "a8657192-49a2-4c45-bc94-bbc3e2e608af"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.055544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a8657192-49a2-4c45-bc94-bbc3e2e608af" (UID: "a8657192-49a2-4c45-bc94-bbc3e2e608af"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.059901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a8657192-49a2-4c45-bc94-bbc3e2e608af" (UID: "a8657192-49a2-4c45-bc94-bbc3e2e608af"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.060072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8657192-49a2-4c45-bc94-bbc3e2e608af-kube-api-access-gbfcc" (OuterVolumeSpecName: "kube-api-access-gbfcc") pod "a8657192-49a2-4c45-bc94-bbc3e2e608af" (UID: "a8657192-49a2-4c45-bc94-bbc3e2e608af"). InnerVolumeSpecName "kube-api-access-gbfcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.064936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a8657192-49a2-4c45-bc94-bbc3e2e608af" (UID: "a8657192-49a2-4c45-bc94-bbc3e2e608af"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.156351 4707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.156398 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbfcc\" (UniqueName: \"kubernetes.io/projected/a8657192-49a2-4c45-bc94-bbc3e2e608af-kube-api-access-gbfcc\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.156413 4707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.156424 4707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-console-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.156436 4707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8657192-49a2-4c45-bc94-bbc3e2e608af-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.382659 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.382734 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.382840 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.383590 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e32285faa40540ae5e3f1a855f8e56122b182016a9d9b345bafd252cfac4ced1"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.383673 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" 
podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://e32285faa40540ae5e3f1a855f8e56122b182016a9d9b345bafd252cfac4ced1" gracePeriod=600 Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.557366 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-m2l7m_a8657192-49a2-4c45-bc94-bbc3e2e608af/console/0.log" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.557442 4707 generic.go:334] "Generic (PLEG): container finished" podID="a8657192-49a2-4c45-bc94-bbc3e2e608af" containerID="1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6" exitCode=2 Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.557559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m2l7m" event={"ID":"a8657192-49a2-4c45-bc94-bbc3e2e608af","Type":"ContainerDied","Data":"1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6"} Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.557556 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-m2l7m" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.557602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m2l7m" event={"ID":"a8657192-49a2-4c45-bc94-bbc3e2e608af","Type":"ContainerDied","Data":"af23d1a942c6640c491cf4f05425f98e4d16a9329a0f9aee15ca80280522bfdb"} Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.557632 4707 scope.go:117] "RemoveContainer" containerID="1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.562850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"e32285faa40540ae5e3f1a855f8e56122b182016a9d9b345bafd252cfac4ced1"} Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.562850 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="e32285faa40540ae5e3f1a855f8e56122b182016a9d9b345bafd252cfac4ced1" exitCode=0 Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.579923 4707 scope.go:117] "RemoveContainer" containerID="1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6" Feb 18 06:00:21 crc kubenswrapper[4707]: E0218 06:00:21.580435 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6\": container with ID starting with 1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6 not found: ID does not exist" containerID="1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.580518 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6"} err="failed to get container status \"1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6\": rpc error: code = NotFound desc = could not find container \"1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6\": container with ID starting with 1e6300ecb82d5e8ace4ccd689b1c36396ec96e7db9733678c5258c3ceebf59a6 not found: ID does not exist" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.580568 4707 scope.go:117] "RemoveContainer" containerID="d49d6adeee4f1d9b81111c98288055c114a1e0649058be818dd7f0b90f16510a" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.606298 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-m2l7m"] Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.626437 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-m2l7m"] Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.798959 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.965199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxs4h\" (UniqueName: \"kubernetes.io/projected/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-kube-api-access-sxs4h\") pod \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.965580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-bundle\") pod \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.965622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-util\") pod \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\" (UID: \"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2\") " Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.966364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-bundle" (OuterVolumeSpecName: "bundle") pod "cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" (UID: "cc452865-bae4-4e28-ac49-ecc5bdd1a5c2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.970354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-kube-api-access-sxs4h" (OuterVolumeSpecName: "kube-api-access-sxs4h") pod "cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" (UID: "cc452865-bae4-4e28-ac49-ecc5bdd1a5c2"). InnerVolumeSpecName "kube-api-access-sxs4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:00:21 crc kubenswrapper[4707]: I0218 06:00:21.974912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-util" (OuterVolumeSpecName: "util") pod "cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" (UID: "cc452865-bae4-4e28-ac49-ecc5bdd1a5c2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:00:22 crc kubenswrapper[4707]: I0218 06:00:22.068604 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxs4h\" (UniqueName: \"kubernetes.io/projected/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-kube-api-access-sxs4h\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:22 crc kubenswrapper[4707]: I0218 06:00:22.068834 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:22 crc kubenswrapper[4707]: I0218 06:00:22.069192 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc452865-bae4-4e28-ac49-ecc5bdd1a5c2-util\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:22 crc kubenswrapper[4707]: I0218 06:00:22.077851 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8657192-49a2-4c45-bc94-bbc3e2e608af" path="/var/lib/kubelet/pods/a8657192-49a2-4c45-bc94-bbc3e2e608af/volumes" Feb 18 06:00:22 crc kubenswrapper[4707]: I0218 06:00:22.570006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"9c49c62c491f92ccee8fba7d855c10e4bbd43134f1f4f54ad1885c2004d7c90c"} Feb 18 06:00:22 crc kubenswrapper[4707]: I0218 06:00:22.572247 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" event={"ID":"cc452865-bae4-4e28-ac49-ecc5bdd1a5c2","Type":"ContainerDied","Data":"e87efa3ed4750315227caae02bc449bf700d2dd3f8a47f3304c8beabecf4f528"} Feb 18 06:00:22 crc kubenswrapper[4707]: I0218 06:00:22.572282 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e87efa3ed4750315227caae02bc449bf700d2dd3f8a47f3304c8beabecf4f528" Feb 18 06:00:22 crc kubenswrapper[4707]: I0218 06:00:22.572345 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.005643 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nlhnq"] Feb 18 06:00:24 crc kubenswrapper[4707]: E0218 06:00:24.006371 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" containerName="pull" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.006393 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" containerName="pull" Feb 18 06:00:24 crc kubenswrapper[4707]: E0218 06:00:24.006428 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8657192-49a2-4c45-bc94-bbc3e2e608af" containerName="console" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.006441 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8657192-49a2-4c45-bc94-bbc3e2e608af" containerName="console" Feb 18 06:00:24 crc kubenswrapper[4707]: E0218 06:00:24.006473 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" containerName="extract" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.006488 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" 
containerName="extract" Feb 18 06:00:24 crc kubenswrapper[4707]: E0218 06:00:24.006513 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" containerName="util" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.006527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" containerName="util" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.006727 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc452865-bae4-4e28-ac49-ecc5bdd1a5c2" containerName="extract" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.006753 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8657192-49a2-4c45-bc94-bbc3e2e608af" containerName="console" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.008092 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nlhnq"] Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.008219 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.091244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfkhh\" (UniqueName: \"kubernetes.io/projected/eefb8976-adb7-48c8-9293-fa7ea3e59160-kube-api-access-tfkhh\") pod \"certified-operators-nlhnq\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.091350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-catalog-content\") pod \"certified-operators-nlhnq\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.091491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-utilities\") pod \"certified-operators-nlhnq\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.192567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfkhh\" (UniqueName: \"kubernetes.io/projected/eefb8976-adb7-48c8-9293-fa7ea3e59160-kube-api-access-tfkhh\") pod \"certified-operators-nlhnq\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.192649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-catalog-content\") pod 
\"certified-operators-nlhnq\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.192692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-utilities\") pod \"certified-operators-nlhnq\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.193115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-catalog-content\") pod \"certified-operators-nlhnq\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.193175 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-utilities\") pod \"certified-operators-nlhnq\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.215005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfkhh\" (UniqueName: \"kubernetes.io/projected/eefb8976-adb7-48c8-9293-fa7ea3e59160-kube-api-access-tfkhh\") pod \"certified-operators-nlhnq\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.327227 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:24 crc kubenswrapper[4707]: I0218 06:00:24.598108 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nlhnq"] Feb 18 06:00:25 crc kubenswrapper[4707]: I0218 06:00:25.598086 4707 generic.go:334] "Generic (PLEG): container finished" podID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerID="752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384" exitCode=0 Feb 18 06:00:25 crc kubenswrapper[4707]: I0218 06:00:25.598154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhnq" event={"ID":"eefb8976-adb7-48c8-9293-fa7ea3e59160","Type":"ContainerDied","Data":"752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384"} Feb 18 06:00:25 crc kubenswrapper[4707]: I0218 06:00:25.598200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhnq" event={"ID":"eefb8976-adb7-48c8-9293-fa7ea3e59160","Type":"ContainerStarted","Data":"af64840528c44ae889f8136cac80a8677e52a38830673b0c328df95586129fdb"} Feb 18 06:00:26 crc kubenswrapper[4707]: I0218 06:00:26.605016 4707 generic.go:334] "Generic (PLEG): container finished" podID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerID="0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99" exitCode=0 Feb 18 06:00:26 crc kubenswrapper[4707]: I0218 06:00:26.605111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhnq" event={"ID":"eefb8976-adb7-48c8-9293-fa7ea3e59160","Type":"ContainerDied","Data":"0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99"} Feb 18 06:00:27 crc kubenswrapper[4707]: I0218 06:00:27.620380 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhnq" 
event={"ID":"eefb8976-adb7-48c8-9293-fa7ea3e59160","Type":"ContainerStarted","Data":"76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788"} Feb 18 06:00:27 crc kubenswrapper[4707]: I0218 06:00:27.647076 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nlhnq" podStartSLOduration=3.272599402 podStartE2EDuration="4.647058765s" podCreationTimestamp="2026-02-18 06:00:23 +0000 UTC" firstStartedPulling="2026-02-18 06:00:25.60192799 +0000 UTC m=+762.249887124" lastFinishedPulling="2026-02-18 06:00:26.976387353 +0000 UTC m=+763.624346487" observedRunningTime="2026-02-18 06:00:27.64426809 +0000 UTC m=+764.292227244" watchObservedRunningTime="2026-02-18 06:00:27.647058765 +0000 UTC m=+764.295017899" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.518043 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x"] Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.519294 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.523508 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.524144 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.524708 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.524707 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vpcwc" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.525977 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.542847 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x"] Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.608445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stqxx\" (UniqueName: \"kubernetes.io/projected/a755d713-37ff-463f-81d6-aa0bfc05c654-kube-api-access-stqxx\") pod \"metallb-operator-controller-manager-7cd6fc9664-wtj2x\" (UID: \"a755d713-37ff-463f-81d6-aa0bfc05c654\") " pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.608533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a755d713-37ff-463f-81d6-aa0bfc05c654-webhook-cert\") pod 
\"metallb-operator-controller-manager-7cd6fc9664-wtj2x\" (UID: \"a755d713-37ff-463f-81d6-aa0bfc05c654\") " pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.608594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a755d713-37ff-463f-81d6-aa0bfc05c654-apiservice-cert\") pod \"metallb-operator-controller-manager-7cd6fc9664-wtj2x\" (UID: \"a755d713-37ff-463f-81d6-aa0bfc05c654\") " pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.709706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a755d713-37ff-463f-81d6-aa0bfc05c654-apiservice-cert\") pod \"metallb-operator-controller-manager-7cd6fc9664-wtj2x\" (UID: \"a755d713-37ff-463f-81d6-aa0bfc05c654\") " pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.709888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stqxx\" (UniqueName: \"kubernetes.io/projected/a755d713-37ff-463f-81d6-aa0bfc05c654-kube-api-access-stqxx\") pod \"metallb-operator-controller-manager-7cd6fc9664-wtj2x\" (UID: \"a755d713-37ff-463f-81d6-aa0bfc05c654\") " pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.709946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a755d713-37ff-463f-81d6-aa0bfc05c654-webhook-cert\") pod \"metallb-operator-controller-manager-7cd6fc9664-wtj2x\" (UID: \"a755d713-37ff-463f-81d6-aa0bfc05c654\") " pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc 
kubenswrapper[4707]: I0218 06:00:33.716129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a755d713-37ff-463f-81d6-aa0bfc05c654-webhook-cert\") pod \"metallb-operator-controller-manager-7cd6fc9664-wtj2x\" (UID: \"a755d713-37ff-463f-81d6-aa0bfc05c654\") " pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.716147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a755d713-37ff-463f-81d6-aa0bfc05c654-apiservice-cert\") pod \"metallb-operator-controller-manager-7cd6fc9664-wtj2x\" (UID: \"a755d713-37ff-463f-81d6-aa0bfc05c654\") " pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.730497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stqxx\" (UniqueName: \"kubernetes.io/projected/a755d713-37ff-463f-81d6-aa0bfc05c654-kube-api-access-stqxx\") pod \"metallb-operator-controller-manager-7cd6fc9664-wtj2x\" (UID: \"a755d713-37ff-463f-81d6-aa0bfc05c654\") " pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.836640 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.958987 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79944b854-l7jrs"] Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.959979 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.965171 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nfwg2" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.965359 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.965485 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 18 06:00:33 crc kubenswrapper[4707]: I0218 06:00:33.970446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79944b854-l7jrs"] Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.094882 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x"] Feb 18 06:00:34 crc kubenswrapper[4707]: W0218 06:00:34.105989 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda755d713_37ff_463f_81d6_aa0bfc05c654.slice/crio-f081aed18f75b9d96c0811f9944b028c4efb5e4f0d84a2fb4a407cf0bb8dc097 WatchSource:0}: Error finding container f081aed18f75b9d96c0811f9944b028c4efb5e4f0d84a2fb4a407cf0bb8dc097: Status 404 returned error can't find the container with id f081aed18f75b9d96c0811f9944b028c4efb5e4f0d84a2fb4a407cf0bb8dc097 Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.123636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44e90078-03e0-4691-ba27-cbd9c5ab9cbe-webhook-cert\") pod \"metallb-operator-webhook-server-79944b854-l7jrs\" (UID: \"44e90078-03e0-4691-ba27-cbd9c5ab9cbe\") " pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" 
Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.123691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkl94\" (UniqueName: \"kubernetes.io/projected/44e90078-03e0-4691-ba27-cbd9c5ab9cbe-kube-api-access-kkl94\") pod \"metallb-operator-webhook-server-79944b854-l7jrs\" (UID: \"44e90078-03e0-4691-ba27-cbd9c5ab9cbe\") " pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.123750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44e90078-03e0-4691-ba27-cbd9c5ab9cbe-apiservice-cert\") pod \"metallb-operator-webhook-server-79944b854-l7jrs\" (UID: \"44e90078-03e0-4691-ba27-cbd9c5ab9cbe\") " pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.225420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44e90078-03e0-4691-ba27-cbd9c5ab9cbe-webhook-cert\") pod \"metallb-operator-webhook-server-79944b854-l7jrs\" (UID: \"44e90078-03e0-4691-ba27-cbd9c5ab9cbe\") " pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.225477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkl94\" (UniqueName: \"kubernetes.io/projected/44e90078-03e0-4691-ba27-cbd9c5ab9cbe-kube-api-access-kkl94\") pod \"metallb-operator-webhook-server-79944b854-l7jrs\" (UID: \"44e90078-03e0-4691-ba27-cbd9c5ab9cbe\") " pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.225542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/44e90078-03e0-4691-ba27-cbd9c5ab9cbe-apiservice-cert\") pod \"metallb-operator-webhook-server-79944b854-l7jrs\" (UID: \"44e90078-03e0-4691-ba27-cbd9c5ab9cbe\") " pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.230331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44e90078-03e0-4691-ba27-cbd9c5ab9cbe-apiservice-cert\") pod \"metallb-operator-webhook-server-79944b854-l7jrs\" (UID: \"44e90078-03e0-4691-ba27-cbd9c5ab9cbe\") " pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.234047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44e90078-03e0-4691-ba27-cbd9c5ab9cbe-webhook-cert\") pod \"metallb-operator-webhook-server-79944b854-l7jrs\" (UID: \"44e90078-03e0-4691-ba27-cbd9c5ab9cbe\") " pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.244490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkl94\" (UniqueName: \"kubernetes.io/projected/44e90078-03e0-4691-ba27-cbd9c5ab9cbe-kube-api-access-kkl94\") pod \"metallb-operator-webhook-server-79944b854-l7jrs\" (UID: \"44e90078-03e0-4691-ba27-cbd9c5ab9cbe\") " pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.293377 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.328379 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.328494 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.384441 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.653555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" event={"ID":"a755d713-37ff-463f-81d6-aa0bfc05c654","Type":"ContainerStarted","Data":"f081aed18f75b9d96c0811f9944b028c4efb5e4f0d84a2fb4a407cf0bb8dc097"} Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.688494 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:34 crc kubenswrapper[4707]: I0218 06:00:34.705601 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79944b854-l7jrs"] Feb 18 06:00:34 crc kubenswrapper[4707]: W0218 06:00:34.707573 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e90078_03e0_4691_ba27_cbd9c5ab9cbe.slice/crio-c7ff14a361ebef5cbdf20577e9ffa7485c2c9ebf96f37dcd26d3a1df2e89bfe6 WatchSource:0}: Error finding container c7ff14a361ebef5cbdf20577e9ffa7485c2c9ebf96f37dcd26d3a1df2e89bfe6: Status 404 returned error can't find the container with id c7ff14a361ebef5cbdf20577e9ffa7485c2c9ebf96f37dcd26d3a1df2e89bfe6 Feb 18 06:00:35 crc kubenswrapper[4707]: I0218 06:00:35.384146 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nlhnq"] Feb 18 06:00:35 crc kubenswrapper[4707]: I0218 06:00:35.661871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" event={"ID":"44e90078-03e0-4691-ba27-cbd9c5ab9cbe","Type":"ContainerStarted","Data":"c7ff14a361ebef5cbdf20577e9ffa7485c2c9ebf96f37dcd26d3a1df2e89bfe6"} Feb 18 06:00:36 crc kubenswrapper[4707]: I0218 06:00:36.667772 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nlhnq" podUID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerName="registry-server" containerID="cri-o://76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788" gracePeriod=2 Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.186961 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.369601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-utilities\") pod \"eefb8976-adb7-48c8-9293-fa7ea3e59160\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.369641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfkhh\" (UniqueName: \"kubernetes.io/projected/eefb8976-adb7-48c8-9293-fa7ea3e59160-kube-api-access-tfkhh\") pod \"eefb8976-adb7-48c8-9293-fa7ea3e59160\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.369728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-catalog-content\") pod 
\"eefb8976-adb7-48c8-9293-fa7ea3e59160\" (UID: \"eefb8976-adb7-48c8-9293-fa7ea3e59160\") " Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.371361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-utilities" (OuterVolumeSpecName: "utilities") pod "eefb8976-adb7-48c8-9293-fa7ea3e59160" (UID: "eefb8976-adb7-48c8-9293-fa7ea3e59160"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.376012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eefb8976-adb7-48c8-9293-fa7ea3e59160-kube-api-access-tfkhh" (OuterVolumeSpecName: "kube-api-access-tfkhh") pod "eefb8976-adb7-48c8-9293-fa7ea3e59160" (UID: "eefb8976-adb7-48c8-9293-fa7ea3e59160"). InnerVolumeSpecName "kube-api-access-tfkhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.416275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eefb8976-adb7-48c8-9293-fa7ea3e59160" (UID: "eefb8976-adb7-48c8-9293-fa7ea3e59160"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.470738 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.470773 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eefb8976-adb7-48c8-9293-fa7ea3e59160-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.470784 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfkhh\" (UniqueName: \"kubernetes.io/projected/eefb8976-adb7-48c8-9293-fa7ea3e59160-kube-api-access-tfkhh\") on node \"crc\" DevicePath \"\"" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.684427 4707 generic.go:334] "Generic (PLEG): container finished" podID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerID="76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788" exitCode=0 Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.684557 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhnq" event={"ID":"eefb8976-adb7-48c8-9293-fa7ea3e59160","Type":"ContainerDied","Data":"76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788"} Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.684599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhnq" event={"ID":"eefb8976-adb7-48c8-9293-fa7ea3e59160","Type":"ContainerDied","Data":"af64840528c44ae889f8136cac80a8677e52a38830673b0c328df95586129fdb"} Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.684627 4707 scope.go:117] "RemoveContainer" containerID="76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 
06:00:37.684849 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nlhnq" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.690982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" event={"ID":"a755d713-37ff-463f-81d6-aa0bfc05c654","Type":"ContainerStarted","Data":"3c0c1dcaa6ca9507bd57519ffe1e7ef5c2931d0a6535691a8c16a974720d80b7"} Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.691816 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.725521 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" podStartSLOduration=1.794206867 podStartE2EDuration="4.72550237s" podCreationTimestamp="2026-02-18 06:00:33 +0000 UTC" firstStartedPulling="2026-02-18 06:00:34.109204941 +0000 UTC m=+770.757164075" lastFinishedPulling="2026-02-18 06:00:37.040500444 +0000 UTC m=+773.688459578" observedRunningTime="2026-02-18 06:00:37.720416064 +0000 UTC m=+774.368375198" watchObservedRunningTime="2026-02-18 06:00:37.72550237 +0000 UTC m=+774.373461504" Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.740540 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nlhnq"] Feb 18 06:00:37 crc kubenswrapper[4707]: I0218 06:00:37.745817 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nlhnq"] Feb 18 06:00:38 crc kubenswrapper[4707]: I0218 06:00:38.060634 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eefb8976-adb7-48c8-9293-fa7ea3e59160" path="/var/lib/kubelet/pods/eefb8976-adb7-48c8-9293-fa7ea3e59160/volumes" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 
06:00:39.062579 4707 scope.go:117] "RemoveContainer" containerID="0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.109739 4707 scope.go:117] "RemoveContainer" containerID="752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.123842 4707 scope.go:117] "RemoveContainer" containerID="76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788" Feb 18 06:00:39 crc kubenswrapper[4707]: E0218 06:00:39.124138 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788\": container with ID starting with 76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788 not found: ID does not exist" containerID="76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.124168 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788"} err="failed to get container status \"76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788\": rpc error: code = NotFound desc = could not find container \"76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788\": container with ID starting with 76e83a07193191e0c1fd6d408137e43fbfdd7c9f98fc28b51808c89c04119788 not found: ID does not exist" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.124188 4707 scope.go:117] "RemoveContainer" containerID="0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99" Feb 18 06:00:39 crc kubenswrapper[4707]: E0218 06:00:39.124544 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99\": container 
with ID starting with 0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99 not found: ID does not exist" containerID="0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.124561 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99"} err="failed to get container status \"0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99\": rpc error: code = NotFound desc = could not find container \"0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99\": container with ID starting with 0704fcff8f3ced6ee02446289fce2420250254dfecac637896546bb89258fc99 not found: ID does not exist" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.124573 4707 scope.go:117] "RemoveContainer" containerID="752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384" Feb 18 06:00:39 crc kubenswrapper[4707]: E0218 06:00:39.124766 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384\": container with ID starting with 752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384 not found: ID does not exist" containerID="752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.124781 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384"} err="failed to get container status \"752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384\": rpc error: code = NotFound desc = could not find container \"752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384\": container with ID starting with 752508eb95ff6cd21cc5518a66737a17c8e1b15de49fd09f7b1f86b56ef0c384 not 
found: ID does not exist" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.703704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" event={"ID":"44e90078-03e0-4691-ba27-cbd9c5ab9cbe","Type":"ContainerStarted","Data":"3a699ea708c802be2dd703b5cb329e186458f76ad68d66bc5d9804ed53146314"} Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.704110 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:00:39 crc kubenswrapper[4707]: I0218 06:00:39.720201 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" podStartSLOduration=2.291237879 podStartE2EDuration="6.720181701s" podCreationTimestamp="2026-02-18 06:00:33 +0000 UTC" firstStartedPulling="2026-02-18 06:00:34.709285309 +0000 UTC m=+771.357244443" lastFinishedPulling="2026-02-18 06:00:39.138229131 +0000 UTC m=+775.786188265" observedRunningTime="2026-02-18 06:00:39.720036487 +0000 UTC m=+776.367995621" watchObservedRunningTime="2026-02-18 06:00:39.720181701 +0000 UTC m=+776.368140835" Feb 18 06:00:54 crc kubenswrapper[4707]: I0218 06:00:54.298766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79944b854-l7jrs" Feb 18 06:01:13 crc kubenswrapper[4707]: I0218 06:01:13.839198 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7cd6fc9664-wtj2x" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.502856 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5z4hx"] Feb 18 06:01:14 crc kubenswrapper[4707]: E0218 06:01:14.503394 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerName="registry-server" Feb 18 
06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.503412 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerName="registry-server" Feb 18 06:01:14 crc kubenswrapper[4707]: E0218 06:01:14.503426 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerName="extract-utilities" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.503434 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerName="extract-utilities" Feb 18 06:01:14 crc kubenswrapper[4707]: E0218 06:01:14.503446 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerName="extract-content" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.503454 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerName="extract-content" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.503585 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eefb8976-adb7-48c8-9293-fa7ea3e59160" containerName="registry-server" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.506315 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.510188 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.510185 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.510537 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-t7jqr" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.513742 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9"] Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.514446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.522075 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.527864 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9"] Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.621785 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-lh4s7"] Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.622687 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.624759 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.625662 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-z6bzh" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.625854 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.627398 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.661154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-metrics\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.661208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-frr-conf\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.661238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-reloader\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.661261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3bf8bf8-cafc-49e2-b284-33d016f8bb50-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-kt4g9\" (UID: \"a3bf8bf8-cafc-49e2-b284-33d016f8bb50\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.661302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9bnx\" (UniqueName: \"kubernetes.io/projected/a3bf8bf8-cafc-49e2-b284-33d016f8bb50-kube-api-access-v9bnx\") pod \"frr-k8s-webhook-server-78b44bf5bb-kt4g9\" (UID: \"a3bf8bf8-cafc-49e2-b284-33d016f8bb50\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.661329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-frr-sockets\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.661359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7260955d-03d4-4758-8159-b7c648865b62-frr-startup\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.661382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7260955d-03d4-4758-8159-b7c648865b62-metrics-certs\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.661416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-s922r\" (UniqueName: \"kubernetes.io/projected/7260955d-03d4-4758-8159-b7c648865b62-kube-api-access-s922r\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.673558 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-4gf26"] Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.674679 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.679053 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.689581 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-4gf26"] Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9bnx\" (UniqueName: \"kubernetes.io/projected/a3bf8bf8-cafc-49e2-b284-33d016f8bb50-kube-api-access-v9bnx\") pod \"frr-k8s-webhook-server-78b44bf5bb-kt4g9\" (UID: \"a3bf8bf8-cafc-49e2-b284-33d016f8bb50\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-frr-sockets\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-memberlist\") 
pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7260955d-03d4-4758-8159-b7c648865b62-frr-startup\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7260955d-03d4-4758-8159-b7c648865b62-metrics-certs\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-metrics-certs\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s922r\" (UniqueName: \"kubernetes.io/projected/7260955d-03d4-4758-8159-b7c648865b62-kube-api-access-s922r\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c3112677-66f0-45d3-9281-094cd5c11163-metallb-excludel2\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc 
kubenswrapper[4707]: I0218 06:01:14.762838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-metrics\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-frr-conf\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.762885 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj54c\" (UniqueName: \"kubernetes.io/projected/c3112677-66f0-45d3-9281-094cd5c11163-kube-api-access-hj54c\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.763005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-reloader\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.763140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3bf8bf8-cafc-49e2-b284-33d016f8bb50-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-kt4g9\" (UID: \"a3bf8bf8-cafc-49e2-b284-33d016f8bb50\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.763671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-metrics\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: E0218 06:01:14.763280 4707 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 18 06:01:14 crc kubenswrapper[4707]: E0218 06:01:14.764082 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3bf8bf8-cafc-49e2-b284-33d016f8bb50-cert podName:a3bf8bf8-cafc-49e2-b284-33d016f8bb50 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:15.263787054 +0000 UTC m=+811.911746188 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3bf8bf8-cafc-49e2-b284-33d016f8bb50-cert") pod "frr-k8s-webhook-server-78b44bf5bb-kt4g9" (UID: "a3bf8bf8-cafc-49e2-b284-33d016f8bb50") : secret "frr-k8s-webhook-server-cert" not found Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.765543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-frr-sockets\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.767112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-frr-conf\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.768144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7260955d-03d4-4758-8159-b7c648865b62-frr-startup\") pod \"frr-k8s-5z4hx\" (UID: 
\"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.771139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7260955d-03d4-4758-8159-b7c648865b62-reloader\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.783489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7260955d-03d4-4758-8159-b7c648865b62-metrics-certs\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.786472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9bnx\" (UniqueName: \"kubernetes.io/projected/a3bf8bf8-cafc-49e2-b284-33d016f8bb50-kube-api-access-v9bnx\") pod \"frr-k8s-webhook-server-78b44bf5bb-kt4g9\" (UID: \"a3bf8bf8-cafc-49e2-b284-33d016f8bb50\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.787176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s922r\" (UniqueName: \"kubernetes.io/projected/7260955d-03d4-4758-8159-b7c648865b62-kube-api-access-s922r\") pod \"frr-k8s-5z4hx\" (UID: \"7260955d-03d4-4758-8159-b7c648865b62\") " pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.825812 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.864153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9027d820-8aca-4e3b-84f0-4b81be566548-metrics-certs\") pod \"controller-69bbfbf88f-4gf26\" (UID: \"9027d820-8aca-4e3b-84f0-4b81be566548\") " pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.864199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj54c\" (UniqueName: \"kubernetes.io/projected/c3112677-66f0-45d3-9281-094cd5c11163-kube-api-access-hj54c\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.864264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-memberlist\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.864285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9027d820-8aca-4e3b-84f0-4b81be566548-cert\") pod \"controller-69bbfbf88f-4gf26\" (UID: \"9027d820-8aca-4e3b-84f0-4b81be566548\") " pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.864313 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-metrics-certs\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 
06:01:14.864346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj8gn\" (UniqueName: \"kubernetes.io/projected/9027d820-8aca-4e3b-84f0-4b81be566548-kube-api-access-sj8gn\") pod \"controller-69bbfbf88f-4gf26\" (UID: \"9027d820-8aca-4e3b-84f0-4b81be566548\") " pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.864370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c3112677-66f0-45d3-9281-094cd5c11163-metallb-excludel2\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: E0218 06:01:14.864381 4707 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 18 06:01:14 crc kubenswrapper[4707]: E0218 06:01:14.864437 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-memberlist podName:c3112677-66f0-45d3-9281-094cd5c11163 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:15.364419996 +0000 UTC m=+812.012379130 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-memberlist") pod "speaker-lh4s7" (UID: "c3112677-66f0-45d3-9281-094cd5c11163") : secret "metallb-memberlist" not found Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.865023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c3112677-66f0-45d3-9281-094cd5c11163-metallb-excludel2\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.870554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-metrics-certs\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.879462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj54c\" (UniqueName: \"kubernetes.io/projected/c3112677-66f0-45d3-9281-094cd5c11163-kube-api-access-hj54c\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.966416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9027d820-8aca-4e3b-84f0-4b81be566548-cert\") pod \"controller-69bbfbf88f-4gf26\" (UID: \"9027d820-8aca-4e3b-84f0-4b81be566548\") " pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.966511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj8gn\" (UniqueName: \"kubernetes.io/projected/9027d820-8aca-4e3b-84f0-4b81be566548-kube-api-access-sj8gn\") pod 
\"controller-69bbfbf88f-4gf26\" (UID: \"9027d820-8aca-4e3b-84f0-4b81be566548\") " pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.966579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9027d820-8aca-4e3b-84f0-4b81be566548-metrics-certs\") pod \"controller-69bbfbf88f-4gf26\" (UID: \"9027d820-8aca-4e3b-84f0-4b81be566548\") " pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.968148 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.973683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9027d820-8aca-4e3b-84f0-4b81be566548-metrics-certs\") pod \"controller-69bbfbf88f-4gf26\" (UID: \"9027d820-8aca-4e3b-84f0-4b81be566548\") " pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.982210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9027d820-8aca-4e3b-84f0-4b81be566548-cert\") pod \"controller-69bbfbf88f-4gf26\" (UID: \"9027d820-8aca-4e3b-84f0-4b81be566548\") " pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.985072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj8gn\" (UniqueName: \"kubernetes.io/projected/9027d820-8aca-4e3b-84f0-4b81be566548-kube-api-access-sj8gn\") pod \"controller-69bbfbf88f-4gf26\" (UID: \"9027d820-8aca-4e3b-84f0-4b81be566548\") " pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:14 crc kubenswrapper[4707]: I0218 06:01:14.993714 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.270669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3bf8bf8-cafc-49e2-b284-33d016f8bb50-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-kt4g9\" (UID: \"a3bf8bf8-cafc-49e2-b284-33d016f8bb50\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.274233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3bf8bf8-cafc-49e2-b284-33d016f8bb50-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-kt4g9\" (UID: \"a3bf8bf8-cafc-49e2-b284-33d016f8bb50\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.371877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-memberlist\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:15 crc kubenswrapper[4707]: E0218 06:01:15.372077 4707 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 18 06:01:15 crc kubenswrapper[4707]: E0218 06:01:15.372343 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-memberlist podName:c3112677-66f0-45d3-9281-094cd5c11163 nodeName:}" failed. No retries permitted until 2026-02-18 06:01:16.372324999 +0000 UTC m=+813.020284133 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-memberlist") pod "speaker-lh4s7" (UID: "c3112677-66f0-45d3-9281-094cd5c11163") : secret "metallb-memberlist" not found Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.383818 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-4gf26"] Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.437496 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.646007 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9"] Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.901046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" event={"ID":"a3bf8bf8-cafc-49e2-b284-33d016f8bb50","Type":"ContainerStarted","Data":"990834da55864c399fe355ecff78f0967f6a13e964146e698edf98baf62c784f"} Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.903033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-4gf26" event={"ID":"9027d820-8aca-4e3b-84f0-4b81be566548","Type":"ContainerStarted","Data":"ecf313157daf54a8b74c0bdd79695cfc89bafe44200581871ff0c6def20b5cfb"} Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.903113 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.903129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-4gf26" event={"ID":"9027d820-8aca-4e3b-84f0-4b81be566548","Type":"ContainerStarted","Data":"ac1821c74fa7c50f4adc8f461db2c94dd712ccb7846ef60dc725faf0deca4866"} Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 
06:01:15.903145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-4gf26" event={"ID":"9027d820-8aca-4e3b-84f0-4b81be566548","Type":"ContainerStarted","Data":"91129fc67f77583b758df8b28590aca3b7830c3a5f17f7517bb1486e2103bfc8"} Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.904075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerStarted","Data":"fde58364fd912db0a27bcc95cf3266581fed8a874f8b138cae509e2c5ccdc89b"} Feb 18 06:01:15 crc kubenswrapper[4707]: I0218 06:01:15.920001 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-4gf26" podStartSLOduration=1.919979706 podStartE2EDuration="1.919979706s" podCreationTimestamp="2026-02-18 06:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:01:15.916503053 +0000 UTC m=+812.564462177" watchObservedRunningTime="2026-02-18 06:01:15.919979706 +0000 UTC m=+812.567938850" Feb 18 06:01:16 crc kubenswrapper[4707]: I0218 06:01:16.385255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-memberlist\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:16 crc kubenswrapper[4707]: I0218 06:01:16.403947 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3112677-66f0-45d3-9281-094cd5c11163-memberlist\") pod \"speaker-lh4s7\" (UID: \"c3112677-66f0-45d3-9281-094cd5c11163\") " pod="metallb-system/speaker-lh4s7" Feb 18 06:01:16 crc kubenswrapper[4707]: I0218 06:01:16.436150 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lh4s7" Feb 18 06:01:16 crc kubenswrapper[4707]: W0218 06:01:16.467588 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3112677_66f0_45d3_9281_094cd5c11163.slice/crio-297fed64bed35da5304f63aa91df1d0d76a3c81e185f105a0380c01238065965 WatchSource:0}: Error finding container 297fed64bed35da5304f63aa91df1d0d76a3c81e185f105a0380c01238065965: Status 404 returned error can't find the container with id 297fed64bed35da5304f63aa91df1d0d76a3c81e185f105a0380c01238065965 Feb 18 06:01:16 crc kubenswrapper[4707]: I0218 06:01:16.917753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lh4s7" event={"ID":"c3112677-66f0-45d3-9281-094cd5c11163","Type":"ContainerStarted","Data":"6a84e52d9530928d9b8d966d16a4786ce644050cfd1a0c4c20ff778db7686157"} Feb 18 06:01:16 crc kubenswrapper[4707]: I0218 06:01:16.917810 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lh4s7" event={"ID":"c3112677-66f0-45d3-9281-094cd5c11163","Type":"ContainerStarted","Data":"297fed64bed35da5304f63aa91df1d0d76a3c81e185f105a0380c01238065965"} Feb 18 06:01:17 crc kubenswrapper[4707]: I0218 06:01:17.944190 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lh4s7" event={"ID":"c3112677-66f0-45d3-9281-094cd5c11163","Type":"ContainerStarted","Data":"706047a35e3628e70b0c97d3ca8fb106574e35c4b6f0a46ad7d1db80897e9924"} Feb 18 06:01:17 crc kubenswrapper[4707]: I0218 06:01:17.944586 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-lh4s7" Feb 18 06:01:17 crc kubenswrapper[4707]: I0218 06:01:17.965906 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-lh4s7" podStartSLOduration=3.965891763 podStartE2EDuration="3.965891763s" podCreationTimestamp="2026-02-18 06:01:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:01:17.962428681 +0000 UTC m=+814.610387815" watchObservedRunningTime="2026-02-18 06:01:17.965891763 +0000 UTC m=+814.613850897" Feb 18 06:01:22 crc kubenswrapper[4707]: I0218 06:01:22.978018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" event={"ID":"a3bf8bf8-cafc-49e2-b284-33d016f8bb50","Type":"ContainerStarted","Data":"7bef8e7828114b6472df8b1e19267e546921253e912ec4e732412e64fc730e7e"} Feb 18 06:01:22 crc kubenswrapper[4707]: I0218 06:01:22.978581 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:22 crc kubenswrapper[4707]: I0218 06:01:22.979664 4707 generic.go:334] "Generic (PLEG): container finished" podID="7260955d-03d4-4758-8159-b7c648865b62" containerID="2a54ecfde5a02c050a5390c8d81269fb909fe88d2e036b606e51af4ba48bc3b3" exitCode=0 Feb 18 06:01:22 crc kubenswrapper[4707]: I0218 06:01:22.979703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerDied","Data":"2a54ecfde5a02c050a5390c8d81269fb909fe88d2e036b606e51af4ba48bc3b3"} Feb 18 06:01:23 crc kubenswrapper[4707]: I0218 06:01:23.027598 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" podStartSLOduration=2.723882174 podStartE2EDuration="9.027580015s" podCreationTimestamp="2026-02-18 06:01:14 +0000 UTC" firstStartedPulling="2026-02-18 06:01:15.656263262 +0000 UTC m=+812.304222396" lastFinishedPulling="2026-02-18 06:01:21.959961103 +0000 UTC m=+818.607920237" observedRunningTime="2026-02-18 06:01:23.002086674 +0000 UTC m=+819.650045808" watchObservedRunningTime="2026-02-18 06:01:23.027580015 +0000 UTC m=+819.675539169" Feb 18 
06:01:23 crc kubenswrapper[4707]: I0218 06:01:23.987293 4707 generic.go:334] "Generic (PLEG): container finished" podID="7260955d-03d4-4758-8159-b7c648865b62" containerID="a4baabb41840f8dd50f7b8a8d6c0dab33fd94e310768f0c255927fc6699080c4" exitCode=0 Feb 18 06:01:23 crc kubenswrapper[4707]: I0218 06:01:23.987394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerDied","Data":"a4baabb41840f8dd50f7b8a8d6c0dab33fd94e310768f0c255927fc6699080c4"} Feb 18 06:01:24 crc kubenswrapper[4707]: I0218 06:01:24.995097 4707 generic.go:334] "Generic (PLEG): container finished" podID="7260955d-03d4-4758-8159-b7c648865b62" containerID="ad914a8108ce8b537fc603efe5c9d0a6c7a2809028a8c1d9dd1f07beb6b13b8f" exitCode=0 Feb 18 06:01:24 crc kubenswrapper[4707]: I0218 06:01:24.995147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerDied","Data":"ad914a8108ce8b537fc603efe5c9d0a6c7a2809028a8c1d9dd1f07beb6b13b8f"} Feb 18 06:01:26 crc kubenswrapper[4707]: I0218 06:01:26.007259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerStarted","Data":"5da8adb30e750a49169da3e5ec9eb97b9a4683689a8fe922744244390ec3019e"} Feb 18 06:01:26 crc kubenswrapper[4707]: I0218 06:01:26.007630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerStarted","Data":"53e1eb22404a5e79b350a07327aec55a4b6cdc318af47c8d432af8fdb4cda979"} Feb 18 06:01:26 crc kubenswrapper[4707]: I0218 06:01:26.007643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" 
event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerStarted","Data":"683323f07a3d88a08e9b4bd91355e04f68998d5f77eedf7eaa204a5ebca0b6d9"} Feb 18 06:01:26 crc kubenswrapper[4707]: I0218 06:01:26.007652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerStarted","Data":"f001eb1477348397a5bb479672bd6849cc5bdb3cbd44e03995f0af26928c7dd6"} Feb 18 06:01:26 crc kubenswrapper[4707]: I0218 06:01:26.007665 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:26 crc kubenswrapper[4707]: I0218 06:01:26.007674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerStarted","Data":"9f50c0b2bf1d4f3f8b877b6691eb5d800bdd3cd4496ab6368f71715baab84a22"} Feb 18 06:01:26 crc kubenswrapper[4707]: I0218 06:01:26.007682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5z4hx" event={"ID":"7260955d-03d4-4758-8159-b7c648865b62","Type":"ContainerStarted","Data":"3e7a53c6b46fca496c08ec0950fce3d04109b0b02f77a39eb9a2221bff93912a"} Feb 18 06:01:26 crc kubenswrapper[4707]: I0218 06:01:26.032699 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5z4hx" podStartSLOduration=5.043062449 podStartE2EDuration="12.032680515s" podCreationTimestamp="2026-02-18 06:01:14 +0000 UTC" firstStartedPulling="2026-02-18 06:01:14.974357935 +0000 UTC m=+811.622317069" lastFinishedPulling="2026-02-18 06:01:21.963976001 +0000 UTC m=+818.611935135" observedRunningTime="2026-02-18 06:01:26.028760501 +0000 UTC m=+822.676719645" watchObservedRunningTime="2026-02-18 06:01:26.032680515 +0000 UTC m=+822.680639649" Feb 18 06:01:26 crc kubenswrapper[4707]: I0218 06:01:26.440766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/speaker-lh4s7" Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.065183 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wcchd"] Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.067344 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wcchd" Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.070118 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.070340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-dh7tm" Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.070500 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.070739 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wcchd"] Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.166573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv56k\" (UniqueName: \"kubernetes.io/projected/cfefbf33-047c-4e0f-b917-48ad5360344c-kube-api-access-vv56k\") pod \"openstack-operator-index-wcchd\" (UID: \"cfefbf33-047c-4e0f-b917-48ad5360344c\") " pod="openstack-operators/openstack-operator-index-wcchd" Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.267597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv56k\" (UniqueName: \"kubernetes.io/projected/cfefbf33-047c-4e0f-b917-48ad5360344c-kube-api-access-vv56k\") pod \"openstack-operator-index-wcchd\" (UID: \"cfefbf33-047c-4e0f-b917-48ad5360344c\") " pod="openstack-operators/openstack-operator-index-wcchd" Feb 18 06:01:29 
crc kubenswrapper[4707]: I0218 06:01:29.288158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv56k\" (UniqueName: \"kubernetes.io/projected/cfefbf33-047c-4e0f-b917-48ad5360344c-kube-api-access-vv56k\") pod \"openstack-operator-index-wcchd\" (UID: \"cfefbf33-047c-4e0f-b917-48ad5360344c\") " pod="openstack-operators/openstack-operator-index-wcchd" Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.421451 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wcchd" Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.815512 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wcchd"] Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.824870 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.826479 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:29 crc kubenswrapper[4707]: I0218 06:01:29.878929 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:30 crc kubenswrapper[4707]: I0218 06:01:30.031458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wcchd" event={"ID":"cfefbf33-047c-4e0f-b917-48ad5360344c","Type":"ContainerStarted","Data":"b76c95727378c33cee39706009f42af35ffbfe58f777ed7cb8d2f9152c5f4270"} Feb 18 06:01:32 crc kubenswrapper[4707]: I0218 06:01:32.238952 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wcchd"] Feb 18 06:01:32 crc kubenswrapper[4707]: I0218 06:01:32.846762 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nnvqj"] Feb 18 06:01:32 crc kubenswrapper[4707]: I0218 
06:01:32.847663 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nnvqj" Feb 18 06:01:32 crc kubenswrapper[4707]: I0218 06:01:32.857945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nnvqj"] Feb 18 06:01:32 crc kubenswrapper[4707]: I0218 06:01:32.948691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f59mf\" (UniqueName: \"kubernetes.io/projected/ccede39b-e3bf-4e86-9b9e-bbdc1b13a349-kube-api-access-f59mf\") pod \"openstack-operator-index-nnvqj\" (UID: \"ccede39b-e3bf-4e86-9b9e-bbdc1b13a349\") " pod="openstack-operators/openstack-operator-index-nnvqj" Feb 18 06:01:33 crc kubenswrapper[4707]: I0218 06:01:33.049844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f59mf\" (UniqueName: \"kubernetes.io/projected/ccede39b-e3bf-4e86-9b9e-bbdc1b13a349-kube-api-access-f59mf\") pod \"openstack-operator-index-nnvqj\" (UID: \"ccede39b-e3bf-4e86-9b9e-bbdc1b13a349\") " pod="openstack-operators/openstack-operator-index-nnvqj" Feb 18 06:01:33 crc kubenswrapper[4707]: I0218 06:01:33.050166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wcchd" event={"ID":"cfefbf33-047c-4e0f-b917-48ad5360344c","Type":"ContainerStarted","Data":"765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3"} Feb 18 06:01:33 crc kubenswrapper[4707]: I0218 06:01:33.069589 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wcchd" podStartSLOduration=1.9134080390000001 podStartE2EDuration="4.069565614s" podCreationTimestamp="2026-02-18 06:01:29 +0000 UTC" firstStartedPulling="2026-02-18 06:01:29.824577899 +0000 UTC m=+826.472537033" lastFinishedPulling="2026-02-18 06:01:31.980735474 +0000 UTC m=+828.628694608" 
observedRunningTime="2026-02-18 06:01:33.064635613 +0000 UTC m=+829.712594757" watchObservedRunningTime="2026-02-18 06:01:33.069565614 +0000 UTC m=+829.717524748" Feb 18 06:01:33 crc kubenswrapper[4707]: I0218 06:01:33.092784 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f59mf\" (UniqueName: \"kubernetes.io/projected/ccede39b-e3bf-4e86-9b9e-bbdc1b13a349-kube-api-access-f59mf\") pod \"openstack-operator-index-nnvqj\" (UID: \"ccede39b-e3bf-4e86-9b9e-bbdc1b13a349\") " pod="openstack-operators/openstack-operator-index-nnvqj" Feb 18 06:01:33 crc kubenswrapper[4707]: I0218 06:01:33.168823 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nnvqj" Feb 18 06:01:33 crc kubenswrapper[4707]: I0218 06:01:33.565984 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nnvqj"] Feb 18 06:01:34 crc kubenswrapper[4707]: I0218 06:01:34.057660 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wcchd" podUID="cfefbf33-047c-4e0f-b917-48ad5360344c" containerName="registry-server" containerID="cri-o://765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3" gracePeriod=2 Feb 18 06:01:34 crc kubenswrapper[4707]: I0218 06:01:34.060188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nnvqj" event={"ID":"ccede39b-e3bf-4e86-9b9e-bbdc1b13a349","Type":"ContainerStarted","Data":"df2e58a9ca02550fb305e40241dd9287b24b806c72f7fcf73c360d0f2267c0d1"} Feb 18 06:01:34 crc kubenswrapper[4707]: I0218 06:01:34.060230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nnvqj" event={"ID":"ccede39b-e3bf-4e86-9b9e-bbdc1b13a349","Type":"ContainerStarted","Data":"11128cd15e92c06943fde4a20775b965f14218ce3b5152813b79b784963c3533"} Feb 18 06:01:34 crc 
kubenswrapper[4707]: I0218 06:01:34.086713 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nnvqj" podStartSLOduration=2.044010127 podStartE2EDuration="2.086692998s" podCreationTimestamp="2026-02-18 06:01:32 +0000 UTC" firstStartedPulling="2026-02-18 06:01:33.565781206 +0000 UTC m=+830.213740340" lastFinishedPulling="2026-02-18 06:01:33.608464077 +0000 UTC m=+830.256423211" observedRunningTime="2026-02-18 06:01:34.084729564 +0000 UTC m=+830.732688709" watchObservedRunningTime="2026-02-18 06:01:34.086692998 +0000 UTC m=+830.734652132" Feb 18 06:01:34 crc kubenswrapper[4707]: I0218 06:01:34.423449 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wcchd" Feb 18 06:01:34 crc kubenswrapper[4707]: I0218 06:01:34.571536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv56k\" (UniqueName: \"kubernetes.io/projected/cfefbf33-047c-4e0f-b917-48ad5360344c-kube-api-access-vv56k\") pod \"cfefbf33-047c-4e0f-b917-48ad5360344c\" (UID: \"cfefbf33-047c-4e0f-b917-48ad5360344c\") " Feb 18 06:01:34 crc kubenswrapper[4707]: I0218 06:01:34.583155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfefbf33-047c-4e0f-b917-48ad5360344c-kube-api-access-vv56k" (OuterVolumeSpecName: "kube-api-access-vv56k") pod "cfefbf33-047c-4e0f-b917-48ad5360344c" (UID: "cfefbf33-047c-4e0f-b917-48ad5360344c"). InnerVolumeSpecName "kube-api-access-vv56k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:01:34 crc kubenswrapper[4707]: I0218 06:01:34.673969 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv56k\" (UniqueName: \"kubernetes.io/projected/cfefbf33-047c-4e0f-b917-48ad5360344c-kube-api-access-vv56k\") on node \"crc\" DevicePath \"\"" Feb 18 06:01:34 crc kubenswrapper[4707]: I0218 06:01:34.998234 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-4gf26" Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.063770 4707 generic.go:334] "Generic (PLEG): container finished" podID="cfefbf33-047c-4e0f-b917-48ad5360344c" containerID="765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3" exitCode=0 Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.063839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wcchd" event={"ID":"cfefbf33-047c-4e0f-b917-48ad5360344c","Type":"ContainerDied","Data":"765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3"} Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.063884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wcchd" event={"ID":"cfefbf33-047c-4e0f-b917-48ad5360344c","Type":"ContainerDied","Data":"b76c95727378c33cee39706009f42af35ffbfe58f777ed7cb8d2f9152c5f4270"} Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.063850 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wcchd" Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.063904 4707 scope.go:117] "RemoveContainer" containerID="765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3" Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.081742 4707 scope.go:117] "RemoveContainer" containerID="765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3" Feb 18 06:01:35 crc kubenswrapper[4707]: E0218 06:01:35.082340 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3\": container with ID starting with 765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3 not found: ID does not exist" containerID="765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3" Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.082385 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3"} err="failed to get container status \"765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3\": rpc error: code = NotFound desc = could not find container \"765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3\": container with ID starting with 765bfa35684ceb21dd58a1b18cc5d7ed5e2778728d116fa54e2ba4b50bf766f3 not found: ID does not exist" Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.097447 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wcchd"] Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.102496 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wcchd"] Feb 18 06:01:35 crc kubenswrapper[4707]: I0218 06:01:35.450252 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kt4g9" Feb 18 06:01:36 crc kubenswrapper[4707]: I0218 06:01:36.060089 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfefbf33-047c-4e0f-b917-48ad5360344c" path="/var/lib/kubelet/pods/cfefbf33-047c-4e0f-b917-48ad5360344c/volumes" Feb 18 06:01:43 crc kubenswrapper[4707]: I0218 06:01:43.169058 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nnvqj" Feb 18 06:01:43 crc kubenswrapper[4707]: I0218 06:01:43.169778 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nnvqj" Feb 18 06:01:43 crc kubenswrapper[4707]: I0218 06:01:43.197158 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-nnvqj" Feb 18 06:01:44 crc kubenswrapper[4707]: I0218 06:01:44.150816 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-nnvqj" Feb 18 06:01:44 crc kubenswrapper[4707]: I0218 06:01:44.830675 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5z4hx" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.331211 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76"] Feb 18 06:01:51 crc kubenswrapper[4707]: E0218 06:01:51.331946 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfefbf33-047c-4e0f-b917-48ad5360344c" containerName="registry-server" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.331961 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfefbf33-047c-4e0f-b917-48ad5360344c" containerName="registry-server" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.332072 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cfefbf33-047c-4e0f-b917-48ad5360344c" containerName="registry-server" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.332918 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.336213 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xsdfm" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.337206 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76"] Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.444819 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-util\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.444911 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fm8h\" (UniqueName: \"kubernetes.io/projected/4a393b6f-ea10-4977-827a-be170d705fff-kube-api-access-8fm8h\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.445008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-bundle\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76\" 
(UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.546500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-util\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.546552 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fm8h\" (UniqueName: \"kubernetes.io/projected/4a393b6f-ea10-4977-827a-be170d705fff-kube-api-access-8fm8h\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.546577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-bundle\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.547043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-util\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc 
kubenswrapper[4707]: I0218 06:01:51.547096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-bundle\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.565477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fm8h\" (UniqueName: \"kubernetes.io/projected/4a393b6f-ea10-4977-827a-be170d705fff-kube-api-access-8fm8h\") pod \"c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:51 crc kubenswrapper[4707]: I0218 06:01:51.656295 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:52 crc kubenswrapper[4707]: I0218 06:01:52.074088 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76"] Feb 18 06:01:52 crc kubenswrapper[4707]: W0218 06:01:52.082312 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a393b6f_ea10_4977_827a_be170d705fff.slice/crio-7a4610329745f6397f167cae6f79c7858b4c798fd416138ba27a0d3614a5deb9 WatchSource:0}: Error finding container 7a4610329745f6397f167cae6f79c7858b4c798fd416138ba27a0d3614a5deb9: Status 404 returned error can't find the container with id 7a4610329745f6397f167cae6f79c7858b4c798fd416138ba27a0d3614a5deb9 Feb 18 06:01:52 crc kubenswrapper[4707]: I0218 06:01:52.175600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" event={"ID":"4a393b6f-ea10-4977-827a-be170d705fff","Type":"ContainerStarted","Data":"7a4610329745f6397f167cae6f79c7858b4c798fd416138ba27a0d3614a5deb9"} Feb 18 06:01:53 crc kubenswrapper[4707]: I0218 06:01:53.185965 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a393b6f-ea10-4977-827a-be170d705fff" containerID="9483fe01dc7f79bab9428e0a9e4ebabc9056318a2a568dc342efd605789e1daf" exitCode=0 Feb 18 06:01:53 crc kubenswrapper[4707]: I0218 06:01:53.186039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" event={"ID":"4a393b6f-ea10-4977-827a-be170d705fff","Type":"ContainerDied","Data":"9483fe01dc7f79bab9428e0a9e4ebabc9056318a2a568dc342efd605789e1daf"} Feb 18 06:01:55 crc kubenswrapper[4707]: I0218 06:01:55.206907 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="4a393b6f-ea10-4977-827a-be170d705fff" containerID="cde240a06a82dc1e164077cd57f02e1f092121ad48080de7719bd10cedc898ac" exitCode=0 Feb 18 06:01:55 crc kubenswrapper[4707]: I0218 06:01:55.207010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" event={"ID":"4a393b6f-ea10-4977-827a-be170d705fff","Type":"ContainerDied","Data":"cde240a06a82dc1e164077cd57f02e1f092121ad48080de7719bd10cedc898ac"} Feb 18 06:01:56 crc kubenswrapper[4707]: I0218 06:01:56.217238 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a393b6f-ea10-4977-827a-be170d705fff" containerID="24232570a3bdcaa3e53b2dc5409b5934621d73709488d00dfdcd4c4b1433be6e" exitCode=0 Feb 18 06:01:56 crc kubenswrapper[4707]: I0218 06:01:56.217295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" event={"ID":"4a393b6f-ea10-4977-827a-be170d705fff","Type":"ContainerDied","Data":"24232570a3bdcaa3e53b2dc5409b5934621d73709488d00dfdcd4c4b1433be6e"} Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.489761 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.527441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-util\") pod \"4a393b6f-ea10-4977-827a-be170d705fff\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.527598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fm8h\" (UniqueName: \"kubernetes.io/projected/4a393b6f-ea10-4977-827a-be170d705fff-kube-api-access-8fm8h\") pod \"4a393b6f-ea10-4977-827a-be170d705fff\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.527679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-bundle\") pod \"4a393b6f-ea10-4977-827a-be170d705fff\" (UID: \"4a393b6f-ea10-4977-827a-be170d705fff\") " Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.528577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-bundle" (OuterVolumeSpecName: "bundle") pod "4a393b6f-ea10-4977-827a-be170d705fff" (UID: "4a393b6f-ea10-4977-827a-be170d705fff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.533637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a393b6f-ea10-4977-827a-be170d705fff-kube-api-access-8fm8h" (OuterVolumeSpecName: "kube-api-access-8fm8h") pod "4a393b6f-ea10-4977-827a-be170d705fff" (UID: "4a393b6f-ea10-4977-827a-be170d705fff"). InnerVolumeSpecName "kube-api-access-8fm8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.578195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-util" (OuterVolumeSpecName: "util") pod "4a393b6f-ea10-4977-827a-be170d705fff" (UID: "4a393b6f-ea10-4977-827a-be170d705fff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.629071 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fm8h\" (UniqueName: \"kubernetes.io/projected/4a393b6f-ea10-4977-827a-be170d705fff-kube-api-access-8fm8h\") on node \"crc\" DevicePath \"\"" Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.629114 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:01:57 crc kubenswrapper[4707]: I0218 06:01:57.629124 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4a393b6f-ea10-4977-827a-be170d705fff-util\") on node \"crc\" DevicePath \"\"" Feb 18 06:01:58 crc kubenswrapper[4707]: I0218 06:01:58.236757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" event={"ID":"4a393b6f-ea10-4977-827a-be170d705fff","Type":"ContainerDied","Data":"7a4610329745f6397f167cae6f79c7858b4c798fd416138ba27a0d3614a5deb9"} Feb 18 06:01:58 crc kubenswrapper[4707]: I0218 06:01:58.236855 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a4610329745f6397f167cae6f79c7858b4c798fd416138ba27a0d3614a5deb9" Feb 18 06:01:58 crc kubenswrapper[4707]: I0218 06:01:58.236861 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.111069 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp"] Feb 18 06:02:04 crc kubenswrapper[4707]: E0218 06:02:04.111644 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a393b6f-ea10-4977-827a-be170d705fff" containerName="pull" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.111658 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a393b6f-ea10-4977-827a-be170d705fff" containerName="pull" Feb 18 06:02:04 crc kubenswrapper[4707]: E0218 06:02:04.111678 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a393b6f-ea10-4977-827a-be170d705fff" containerName="extract" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.111685 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a393b6f-ea10-4977-827a-be170d705fff" containerName="extract" Feb 18 06:02:04 crc kubenswrapper[4707]: E0218 06:02:04.111696 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a393b6f-ea10-4977-827a-be170d705fff" containerName="util" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.111704 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a393b6f-ea10-4977-827a-be170d705fff" containerName="util" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.111854 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a393b6f-ea10-4977-827a-be170d705fff" containerName="extract" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.112359 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.114904 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-v8m7m" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.131639 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp"] Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.216446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcwql\" (UniqueName: \"kubernetes.io/projected/b70a612f-7e0b-4187-82b0-404c913ce3d4-kube-api-access-hcwql\") pod \"openstack-operator-controller-init-766dc4fc6-q9dtp\" (UID: \"b70a612f-7e0b-4187-82b0-404c913ce3d4\") " pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.318126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcwql\" (UniqueName: \"kubernetes.io/projected/b70a612f-7e0b-4187-82b0-404c913ce3d4-kube-api-access-hcwql\") pod \"openstack-operator-controller-init-766dc4fc6-q9dtp\" (UID: \"b70a612f-7e0b-4187-82b0-404c913ce3d4\") " pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.342107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcwql\" (UniqueName: \"kubernetes.io/projected/b70a612f-7e0b-4187-82b0-404c913ce3d4-kube-api-access-hcwql\") pod \"openstack-operator-controller-init-766dc4fc6-q9dtp\" (UID: \"b70a612f-7e0b-4187-82b0-404c913ce3d4\") " pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.485475 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" Feb 18 06:02:04 crc kubenswrapper[4707]: I0218 06:02:04.885382 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp"] Feb 18 06:02:05 crc kubenswrapper[4707]: I0218 06:02:05.283531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" event={"ID":"b70a612f-7e0b-4187-82b0-404c913ce3d4","Type":"ContainerStarted","Data":"9245ea05833e30ede08eaf902de67cad73b26dfd788710ab4483744343f13d4d"} Feb 18 06:02:09 crc kubenswrapper[4707]: I0218 06:02:09.348881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" event={"ID":"b70a612f-7e0b-4187-82b0-404c913ce3d4","Type":"ContainerStarted","Data":"7b8948907f7893e23a919417e5bf3a848fcc84bbe1caf6e9e884ca3f0ff4d435"} Feb 18 06:02:09 crc kubenswrapper[4707]: I0218 06:02:09.349533 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" Feb 18 06:02:09 crc kubenswrapper[4707]: I0218 06:02:09.381438 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" podStartSLOduration=2.018869789 podStartE2EDuration="5.381419194s" podCreationTimestamp="2026-02-18 06:02:04 +0000 UTC" firstStartedPulling="2026-02-18 06:02:04.893934977 +0000 UTC m=+861.541894111" lastFinishedPulling="2026-02-18 06:02:08.256484382 +0000 UTC m=+864.904443516" observedRunningTime="2026-02-18 06:02:09.379375629 +0000 UTC m=+866.027334793" watchObservedRunningTime="2026-02-18 06:02:09.381419194 +0000 UTC m=+866.029378328" Feb 18 06:02:14 crc kubenswrapper[4707]: I0218 06:02:14.487375 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-init-766dc4fc6-q9dtp" Feb 18 06:02:21 crc kubenswrapper[4707]: I0218 06:02:21.382783 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:02:21 crc kubenswrapper[4707]: I0218 06:02:21.383584 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.111964 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.114523 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.117132 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.118239 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.123675 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lp5f5" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.123808 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-w2645" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.127641 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.135479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.146148 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-fnz67"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.149396 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.152104 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-29lkh" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.165985 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-fnz67"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.172192 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-244nk"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.205624 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.216512 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-m4dls" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.237903 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.247457 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.263884 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.271312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-5x27s" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.285824 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-244nk"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.299287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k7vt\" (UniqueName: \"kubernetes.io/projected/4fcd0bf8-cf6a-45c0-862b-5554daa34c21-kube-api-access-9k7vt\") pod \"cinder-operator-controller-manager-5d946d989d-rnhbn\" (UID: \"4fcd0bf8-cf6a-45c0-862b-5554daa34c21\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.299355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hklmk\" (UniqueName: \"kubernetes.io/projected/6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742-kube-api-access-hklmk\") pod \"heat-operator-controller-manager-69f49c598c-244nk\" (UID: \"6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.299421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw4m9\" (UniqueName: \"kubernetes.io/projected/67bdd3cc-ee7d-4e79-8568-75502788aa1d-kube-api-access-xw4m9\") pod 
\"barbican-operator-controller-manager-868647ff47-qj27r\" (UID: \"67bdd3cc-ee7d-4e79-8568-75502788aa1d\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.299450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4qcl\" (UniqueName: \"kubernetes.io/projected/8f61ada5-7374-4801-89b2-c95aec2e52ab-kube-api-access-p4qcl\") pod \"glance-operator-controller-manager-77987464f4-fnz67\" (UID: \"8f61ada5-7374-4801-89b2-c95aec2e52ab\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.313408 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.314255 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.322876 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-8gngt"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.323611 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.323690 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.340225 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-sbr4t" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.340450 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.340558 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-lxb5b" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.343853 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.344861 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.355284 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.356147 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.366817 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-8gngt"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.373151 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-9w4np" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.373618 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-65g8b" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.382232 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.382570 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.382655 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.383629 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.393339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2t5rg" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.398432 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.403862 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hklmk\" (UniqueName: \"kubernetes.io/projected/6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742-kube-api-access-hklmk\") pod \"heat-operator-controller-manager-69f49c598c-244nk\" (UID: \"6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.404400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96mns\" (UniqueName: \"kubernetes.io/projected/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-kube-api-access-96mns\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.404599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw86k\" (UniqueName: \"kubernetes.io/projected/1a236879-9c6a-4604-b5bc-024b7dfd5161-kube-api-access-hw86k\") pod \"horizon-operator-controller-manager-5b9b8895d5-85hj5\" (UID: \"1a236879-9c6a-4604-b5bc-024b7dfd5161\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.404628 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.404695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw4m9\" (UniqueName: \"kubernetes.io/projected/67bdd3cc-ee7d-4e79-8568-75502788aa1d-kube-api-access-xw4m9\") pod \"barbican-operator-controller-manager-868647ff47-qj27r\" (UID: \"67bdd3cc-ee7d-4e79-8568-75502788aa1d\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.405451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4qcl\" (UniqueName: \"kubernetes.io/projected/8f61ada5-7374-4801-89b2-c95aec2e52ab-kube-api-access-p4qcl\") pod \"glance-operator-controller-manager-77987464f4-fnz67\" (UID: \"8f61ada5-7374-4801-89b2-c95aec2e52ab\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.405546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg7v6\" (UniqueName: \"kubernetes.io/projected/bc6f5234-aab6-43ea-89e1-a3f785742a89-kube-api-access-qg7v6\") pod \"designate-operator-controller-manager-6d8bf5c495-6lrh7\" (UID: \"bc6f5234-aab6-43ea-89e1-a3f785742a89\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.406974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k7vt\" (UniqueName: 
\"kubernetes.io/projected/4fcd0bf8-cf6a-45c0-862b-5554daa34c21-kube-api-access-9k7vt\") pod \"cinder-operator-controller-manager-5d946d989d-rnhbn\" (UID: \"4fcd0bf8-cf6a-45c0-862b-5554daa34c21\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.413857 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.414932 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.420990 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-86gqs" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.438294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4qcl\" (UniqueName: \"kubernetes.io/projected/8f61ada5-7374-4801-89b2-c95aec2e52ab-kube-api-access-p4qcl\") pod \"glance-operator-controller-manager-77987464f4-fnz67\" (UID: \"8f61ada5-7374-4801-89b2-c95aec2e52ab\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.442501 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.454107 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.455469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw4m9\" (UniqueName: 
\"kubernetes.io/projected/67bdd3cc-ee7d-4e79-8568-75502788aa1d-kube-api-access-xw4m9\") pod \"barbican-operator-controller-manager-868647ff47-qj27r\" (UID: \"67bdd3cc-ee7d-4e79-8568-75502788aa1d\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.464677 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.465060 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.488497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k7vt\" (UniqueName: \"kubernetes.io/projected/4fcd0bf8-cf6a-45c0-862b-5554daa34c21-kube-api-access-9k7vt\") pod \"cinder-operator-controller-manager-5d946d989d-rnhbn\" (UID: \"4fcd0bf8-cf6a-45c0-862b-5554daa34c21\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.488598 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.488630 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.488645 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.489304 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.489755 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.492242 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.508615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hklmk\" (UniqueName: \"kubernetes.io/projected/6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742-kube-api-access-hklmk\") pod \"heat-operator-controller-manager-69f49c598c-244nk\" (UID: \"6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.508689 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.509498 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.510571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg7v6\" (UniqueName: \"kubernetes.io/projected/bc6f5234-aab6-43ea-89e1-a3f785742a89-kube-api-access-qg7v6\") pod \"designate-operator-controller-manager-6d8bf5c495-6lrh7\" (UID: \"bc6f5234-aab6-43ea-89e1-a3f785742a89\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.510666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgftf\" (UniqueName: \"kubernetes.io/projected/4c759f5c-da54-44e9-8dec-5f2622419af9-kube-api-access-fgftf\") pod \"mariadb-operator-controller-manager-6994f66f48-v6v5m\" (UID: \"4c759f5c-da54-44e9-8dec-5f2622419af9\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.510703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvdmz\" (UniqueName: \"kubernetes.io/projected/dc8762c9-27f5-476e-840f-815aa3736e85-kube-api-access-hvdmz\") pod \"manila-operator-controller-manager-54f6768c69-c5s9h\" (UID: \"dc8762c9-27f5-476e-840f-815aa3736e85\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.510730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96mns\" (UniqueName: \"kubernetes.io/projected/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-kube-api-access-96mns\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:51 
crc kubenswrapper[4707]: I0218 06:02:51.510761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqtrx\" (UniqueName: \"kubernetes.io/projected/93d80e73-44d0-4db8-8a43-ee2cc8b7e399-kube-api-access-mqtrx\") pod \"ironic-operator-controller-manager-554564d7fc-j5dft\" (UID: \"93d80e73-44d0-4db8-8a43-ee2cc8b7e399\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.510811 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq7xf\" (UniqueName: \"kubernetes.io/projected/274d7d14-4ef9-47b8-8a2e-07e7a2bb9850-kube-api-access-xq7xf\") pod \"keystone-operator-controller-manager-b4d948c87-6f96w\" (UID: \"274d7d14-4ef9-47b8-8a2e-07e7a2bb9850\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.510850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw86k\" (UniqueName: \"kubernetes.io/projected/1a236879-9c6a-4604-b5bc-024b7dfd5161-kube-api-access-hw86k\") pod \"horizon-operator-controller-manager-5b9b8895d5-85hj5\" (UID: \"1a236879-9c6a-4604-b5bc-024b7dfd5161\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.510872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:51 crc kubenswrapper[4707]: E0218 06:02:51.511044 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Feb 18 06:02:51 crc kubenswrapper[4707]: E0218 06:02:51.511106 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert podName:8ed2f5cf-84b8-4a09-b76f-a60bcb055a04 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:52.011084029 +0000 UTC m=+908.659043163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert") pod "infra-operator-controller-manager-79d975b745-8gngt" (UID: "8ed2f5cf-84b8-4a09-b76f-a60bcb055a04") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.514725 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.519772 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-l8ltf" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.520086 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xd77n" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.520105 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.535086 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-zstp5" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.535566 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-2szzr" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.546470 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.548075 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.554266 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.555145 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.555283 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-hnrrt" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.572908 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.588914 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.609259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg7v6\" (UniqueName: \"kubernetes.io/projected/bc6f5234-aab6-43ea-89e1-a3f785742a89-kube-api-access-qg7v6\") pod \"designate-operator-controller-manager-6d8bf5c495-6lrh7\" (UID: \"bc6f5234-aab6-43ea-89e1-a3f785742a89\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.618386 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq7xf\" (UniqueName: \"kubernetes.io/projected/274d7d14-4ef9-47b8-8a2e-07e7a2bb9850-kube-api-access-xq7xf\") pod \"keystone-operator-controller-manager-b4d948c87-6f96w\" (UID: \"274d7d14-4ef9-47b8-8a2e-07e7a2bb9850\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.618533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nft9v\" (UniqueName: \"kubernetes.io/projected/890576c4-79c6-40dc-b786-0fb2055a1a3e-kube-api-access-nft9v\") pod \"octavia-operator-controller-manager-69f8888797-4d59f\" (UID: \"890576c4-79c6-40dc-b786-0fb2055a1a3e\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.618572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7vxm\" (UniqueName: \"kubernetes.io/projected/5442f037-ff83-40b8-9c3f-c73c227effde-kube-api-access-g7vxm\") pod \"neutron-operator-controller-manager-64ddbf8bb-2hv82\" (UID: \"5442f037-ff83-40b8-9c3f-c73c227effde\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.618608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgftf\" (UniqueName: \"kubernetes.io/projected/4c759f5c-da54-44e9-8dec-5f2622419af9-kube-api-access-fgftf\") pod \"mariadb-operator-controller-manager-6994f66f48-v6v5m\" (UID: \"4c759f5c-da54-44e9-8dec-5f2622419af9\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.618641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkrln\" (UniqueName: \"kubernetes.io/projected/dc38a034-90cc-4976-93dd-ae54d298b574-kube-api-access-nkrln\") pod \"nova-operator-controller-manager-567668f5cf-twhwz\" (UID: \"dc38a034-90cc-4976-93dd-ae54d298b574\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.618681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvdmz\" (UniqueName: 
\"kubernetes.io/projected/dc8762c9-27f5-476e-840f-815aa3736e85-kube-api-access-hvdmz\") pod \"manila-operator-controller-manager-54f6768c69-c5s9h\" (UID: \"dc8762c9-27f5-476e-840f-815aa3736e85\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.618709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s84t\" (UniqueName: \"kubernetes.io/projected/1332f158-2c06-4a3d-9ca9-2dc667c471ba-kube-api-access-6s84t\") pod \"ovn-operator-controller-manager-d44cf6b75-9bb9z\" (UID: \"1332f158-2c06-4a3d-9ca9-2dc667c471ba\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.618752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqtrx\" (UniqueName: \"kubernetes.io/projected/93d80e73-44d0-4db8-8a43-ee2cc8b7e399-kube-api-access-mqtrx\") pod \"ironic-operator-controller-manager-554564d7fc-j5dft\" (UID: \"93d80e73-44d0-4db8-8a43-ee2cc8b7e399\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.620158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96mns\" (UniqueName: \"kubernetes.io/projected/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-kube-api-access-96mns\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.621424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw86k\" (UniqueName: \"kubernetes.io/projected/1a236879-9c6a-4604-b5bc-024b7dfd5161-kube-api-access-hw86k\") pod \"horizon-operator-controller-manager-5b9b8895d5-85hj5\" (UID: 
\"1a236879-9c6a-4604-b5bc-024b7dfd5161\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.651437 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.652842 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.657703 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.670656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq7xf\" (UniqueName: \"kubernetes.io/projected/274d7d14-4ef9-47b8-8a2e-07e7a2bb9850-kube-api-access-xq7xf\") pod \"keystone-operator-controller-manager-b4d948c87-6f96w\" (UID: \"274d7d14-4ef9-47b8-8a2e-07e7a2bb9850\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.672173 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-f6vkt" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.673084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvdmz\" (UniqueName: \"kubernetes.io/projected/dc8762c9-27f5-476e-840f-815aa3736e85-kube-api-access-hvdmz\") pod \"manila-operator-controller-manager-54f6768c69-c5s9h\" (UID: \"dc8762c9-27f5-476e-840f-815aa3736e85\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.688576 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.697474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgftf\" (UniqueName: \"kubernetes.io/projected/4c759f5c-da54-44e9-8dec-5f2622419af9-kube-api-access-fgftf\") pod \"mariadb-operator-controller-manager-6994f66f48-v6v5m\" (UID: \"4c759f5c-da54-44e9-8dec-5f2622419af9\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.697836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqtrx\" (UniqueName: \"kubernetes.io/projected/93d80e73-44d0-4db8-8a43-ee2cc8b7e399-kube-api-access-mqtrx\") pod \"ironic-operator-controller-manager-554564d7fc-j5dft\" (UID: \"93d80e73-44d0-4db8-8a43-ee2cc8b7e399\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.711914 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.726193 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.750267 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.750444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkqtb\" (UniqueName: \"kubernetes.io/projected/8078f629-a80e-4f59-b84a-33144cc5b0c6-kube-api-access-hkqtb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.781949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nft9v\" (UniqueName: \"kubernetes.io/projected/890576c4-79c6-40dc-b786-0fb2055a1a3e-kube-api-access-nft9v\") pod \"octavia-operator-controller-manager-69f8888797-4d59f\" (UID: \"890576c4-79c6-40dc-b786-0fb2055a1a3e\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.782291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbfcw\" (UniqueName: 
\"kubernetes.io/projected/398bbd80-3377-4b8e-b9cd-bdb3a76167ca-kube-api-access-jbfcw\") pod \"placement-operator-controller-manager-8497b45c89-48k7j\" (UID: \"398bbd80-3377-4b8e-b9cd-bdb3a76167ca\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.782316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7vxm\" (UniqueName: \"kubernetes.io/projected/5442f037-ff83-40b8-9c3f-c73c227effde-kube-api-access-g7vxm\") pod \"neutron-operator-controller-manager-64ddbf8bb-2hv82\" (UID: \"5442f037-ff83-40b8-9c3f-c73c227effde\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.782372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkrln\" (UniqueName: \"kubernetes.io/projected/dc38a034-90cc-4976-93dd-ae54d298b574-kube-api-access-nkrln\") pod \"nova-operator-controller-manager-567668f5cf-twhwz\" (UID: \"dc38a034-90cc-4976-93dd-ae54d298b574\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.782432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s84t\" (UniqueName: \"kubernetes.io/projected/1332f158-2c06-4a3d-9ca9-2dc667c471ba-kube-api-access-6s84t\") pod \"ovn-operator-controller-manager-d44cf6b75-9bb9z\" (UID: \"1332f158-2c06-4a3d-9ca9-2dc667c471ba\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.796611 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.797430 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.809989 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.881392 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.885006 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.888276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:51 crc kubenswrapper[4707]: E0218 06:02:51.888391 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.887770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s84t\" (UniqueName: \"kubernetes.io/projected/1332f158-2c06-4a3d-9ca9-2dc667c471ba-kube-api-access-6s84t\") pod \"ovn-operator-controller-manager-d44cf6b75-9bb9z\" (UID: \"1332f158-2c06-4a3d-9ca9-2dc667c471ba\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" Feb 18 06:02:51 crc kubenswrapper[4707]: E0218 06:02:51.889482 4707 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert podName:8078f629-a80e-4f59-b84a-33144cc5b0c6 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:52.389435616 +0000 UTC m=+909.037394750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" (UID: "8078f629-a80e-4f59-b84a-33144cc5b0c6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.888940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkrln\" (UniqueName: \"kubernetes.io/projected/dc38a034-90cc-4976-93dd-ae54d298b574-kube-api-access-nkrln\") pod \"nova-operator-controller-manager-567668f5cf-twhwz\" (UID: \"dc38a034-90cc-4976-93dd-ae54d298b574\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.889515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkqtb\" (UniqueName: \"kubernetes.io/projected/8078f629-a80e-4f59-b84a-33144cc5b0c6-kube-api-access-hkqtb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.890263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7vxm\" (UniqueName: \"kubernetes.io/projected/5442f037-ff83-40b8-9c3f-c73c227effde-kube-api-access-g7vxm\") pod \"neutron-operator-controller-manager-64ddbf8bb-2hv82\" (UID: \"5442f037-ff83-40b8-9c3f-c73c227effde\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" Feb 18 06:02:51 crc kubenswrapper[4707]: 
I0218 06:02:51.890727 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.892487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nft9v\" (UniqueName: \"kubernetes.io/projected/890576c4-79c6-40dc-b786-0fb2055a1a3e-kube-api-access-nft9v\") pod \"octavia-operator-controller-manager-69f8888797-4d59f\" (UID: \"890576c4-79c6-40dc-b786-0fb2055a1a3e\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.899444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbfcw\" (UniqueName: \"kubernetes.io/projected/398bbd80-3377-4b8e-b9cd-bdb3a76167ca-kube-api-access-jbfcw\") pod \"placement-operator-controller-manager-8497b45c89-48k7j\" (UID: \"398bbd80-3377-4b8e-b9cd-bdb3a76167ca\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.909171 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.912364 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zp49w" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.922640 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbfcw\" (UniqueName: \"kubernetes.io/projected/398bbd80-3377-4b8e-b9cd-bdb3a76167ca-kube-api-access-jbfcw\") pod \"placement-operator-controller-manager-8497b45c89-48k7j\" (UID: \"398bbd80-3377-4b8e-b9cd-bdb3a76167ca\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.937713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkqtb\" (UniqueName: \"kubernetes.io/projected/8078f629-a80e-4f59-b84a-33144cc5b0c6-kube-api-access-hkqtb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.938252 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.943183 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.944189 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.952489 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bwghv" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.960071 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.971275 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2dp4d"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.976323 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.985465 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.985567 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.993055 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.996273 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-wj49v" Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.996428 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j"] Feb 18 06:02:51 crc kubenswrapper[4707]: I0218 06:02:51.997394 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.003128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-zz9tl" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.007133 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2dp4d"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.013388 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.018874 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.102714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsqfh\" (UniqueName: \"kubernetes.io/projected/77cee8d8-c1d5-4743-a6c0-478b7c16e991-kube-api-access-wsqfh\") pod \"watcher-operator-controller-manager-5db88f68c-xbl8j\" (UID: \"77cee8d8-c1d5-4743-a6c0-478b7c16e991\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.103124 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrwck\" (UniqueName: \"kubernetes.io/projected/d4364aee-09c0-49d9-8f50-60e48ecb7d08-kube-api-access-hrwck\") pod \"test-operator-controller-manager-7866795846-2dp4d\" (UID: \"d4364aee-09c0-49d9-8f50-60e48ecb7d08\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.103175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnbct\" (UniqueName: \"kubernetes.io/projected/7f2692c0-280b-4449-ac2d-6a9da6eafebe-kube-api-access-jnbct\") pod \"telemetry-operator-controller-manager-7f45b4ff68-ln7bk\" (UID: \"7f2692c0-280b-4449-ac2d-6a9da6eafebe\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.103196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm7nl\" (UniqueName: 
\"kubernetes.io/projected/78a96912-db1a-42b8-80aa-7800f28fb0c2-kube-api-access-hm7nl\") pod \"swift-operator-controller-manager-68f46476f-tj2tj\" (UID: \"78a96912-db1a-42b8-80aa-7800f28fb0c2\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.103206 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.103282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.103404 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.103442 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert podName:8ed2f5cf-84b8-4a09-b76f-a60bcb055a04 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:53.103429032 +0000 UTC m=+909.751388156 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert") pod "infra-operator-controller-manager-79d975b745-8gngt" (UID: "8ed2f5cf-84b8-4a09-b76f-a60bcb055a04") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.108493 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.113514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.113551 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.113515 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-g97ww" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.134492 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.140136 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.204099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.204256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" 
Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.204340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsqfh\" (UniqueName: \"kubernetes.io/projected/77cee8d8-c1d5-4743-a6c0-478b7c16e991-kube-api-access-wsqfh\") pod \"watcher-operator-controller-manager-5db88f68c-xbl8j\" (UID: \"77cee8d8-c1d5-4743-a6c0-478b7c16e991\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.204387 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s76m4\" (UniqueName: \"kubernetes.io/projected/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-kube-api-access-s76m4\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.204448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrwck\" (UniqueName: \"kubernetes.io/projected/d4364aee-09c0-49d9-8f50-60e48ecb7d08-kube-api-access-hrwck\") pod \"test-operator-controller-manager-7866795846-2dp4d\" (UID: \"d4364aee-09c0-49d9-8f50-60e48ecb7d08\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.204496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnbct\" (UniqueName: \"kubernetes.io/projected/7f2692c0-280b-4449-ac2d-6a9da6eafebe-kube-api-access-jnbct\") pod \"telemetry-operator-controller-manager-7f45b4ff68-ln7bk\" (UID: \"7f2692c0-280b-4449-ac2d-6a9da6eafebe\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.204525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hm7nl\" (UniqueName: \"kubernetes.io/projected/78a96912-db1a-42b8-80aa-7800f28fb0c2-kube-api-access-hm7nl\") pod \"swift-operator-controller-manager-68f46476f-tj2tj\" (UID: \"78a96912-db1a-42b8-80aa-7800f28fb0c2\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.205300 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.206555 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.209049 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-v5czw" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.218119 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.226510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsqfh\" (UniqueName: \"kubernetes.io/projected/77cee8d8-c1d5-4743-a6c0-478b7c16e991-kube-api-access-wsqfh\") pod \"watcher-operator-controller-manager-5db88f68c-xbl8j\" (UID: \"77cee8d8-c1d5-4743-a6c0-478b7c16e991\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.234609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm7nl\" (UniqueName: \"kubernetes.io/projected/78a96912-db1a-42b8-80aa-7800f28fb0c2-kube-api-access-hm7nl\") pod \"swift-operator-controller-manager-68f46476f-tj2tj\" (UID: \"78a96912-db1a-42b8-80aa-7800f28fb0c2\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.235372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnbct\" (UniqueName: \"kubernetes.io/projected/7f2692c0-280b-4449-ac2d-6a9da6eafebe-kube-api-access-jnbct\") pod \"telemetry-operator-controller-manager-7f45b4ff68-ln7bk\" (UID: \"7f2692c0-280b-4449-ac2d-6a9da6eafebe\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.235595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrwck\" (UniqueName: \"kubernetes.io/projected/d4364aee-09c0-49d9-8f50-60e48ecb7d08-kube-api-access-hrwck\") pod \"test-operator-controller-manager-7866795846-2dp4d\" (UID: \"d4364aee-09c0-49d9-8f50-60e48ecb7d08\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.241435 4707 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.255236 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.267076 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-fnz67"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.294376 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.305384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.305463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s76m4\" (UniqueName: \"kubernetes.io/projected/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-kube-api-access-s76m4\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.305497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplbn\" (UniqueName: \"kubernetes.io/projected/97e7c996-241f-4732-9e68-a371d114f664-kube-api-access-hplbn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4qmcg\" 
(UID: \"97e7c996-241f-4732-9e68-a371d114f664\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.305573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.305720 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.305776 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:52.805758194 +0000 UTC m=+909.453717328 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "metrics-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.306064 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.306095 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:52.806085513 +0000 UTC m=+909.454044647 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "webhook-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.328427 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.334868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s76m4\" (UniqueName: \"kubernetes.io/projected/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-kube-api-access-s76m4\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.342668 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.406329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.406411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplbn\" (UniqueName: \"kubernetes.io/projected/97e7c996-241f-4732-9e68-a371d114f664-kube-api-access-hplbn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4qmcg\" (UID: \"97e7c996-241f-4732-9e68-a371d114f664\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.406779 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.406837 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert podName:8078f629-a80e-4f59-b84a-33144cc5b0c6 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:53.406824347 +0000 UTC m=+910.054783481 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" (UID: "8078f629-a80e-4f59-b84a-33144cc5b0c6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.446284 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-244nk"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.467287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplbn\" (UniqueName: \"kubernetes.io/projected/97e7c996-241f-4732-9e68-a371d114f664-kube-api-access-hplbn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4qmcg\" (UID: \"97e7c996-241f-4732-9e68-a371d114f664\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.590169 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.632014 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.635535 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.641319 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w"] Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.813306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.813411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.813490 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.813563 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs 
podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:53.813544076 +0000 UTC m=+910.461503200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "metrics-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.813591 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: E0218 06:02:52.813643 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:53.813627308 +0000 UTC m=+910.461586442 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "webhook-server-cert" not found Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.828822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" event={"ID":"4fcd0bf8-cf6a-45c0-862b-5554daa34c21","Type":"ContainerStarted","Data":"bce7183d655dcc1ef11b4bf35daaa615a990e9dd0bf0bbc63511656fbe208d71"} Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.830769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" event={"ID":"8f61ada5-7374-4801-89b2-c95aec2e52ab","Type":"ContainerStarted","Data":"1e19cbb5bbdc689ce89ab32f08cdb77b00cee28076fc8789c793a8fd6f5665f1"} Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.831951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" event={"ID":"bc6f5234-aab6-43ea-89e1-a3f785742a89","Type":"ContainerStarted","Data":"3ef833bbb3ab954ebef444b9ef742cd55dcfe05694e26a3b2b860f1abb7b646d"} Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.833011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" event={"ID":"274d7d14-4ef9-47b8-8a2e-07e7a2bb9850","Type":"ContainerStarted","Data":"5c8aa150be51ed8f975efcadb3822fe20ac5112f03621ef93531900747d35bfd"} Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.834505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" 
event={"ID":"6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742","Type":"ContainerStarted","Data":"097a477a1928a859e2e65b67fa8b284c4aaf403ec09a6810e624bae6cc0009b8"} Feb 18 06:02:52 crc kubenswrapper[4707]: I0218 06:02:52.872180 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5"] Feb 18 06:02:52 crc kubenswrapper[4707]: W0218 06:02:52.872560 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a236879_9c6a_4604_b5bc_024b7dfd5161.slice/crio-ac069783c7262242b57f9896dedd4e7bb045b7983f5c0df24c48e7d0801e8782 WatchSource:0}: Error finding container ac069783c7262242b57f9896dedd4e7bb045b7983f5c0df24c48e7d0801e8782: Status 404 returned error can't find the container with id ac069783c7262242b57f9896dedd4e7bb045b7983f5c0df24c48e7d0801e8782 Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.071009 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz"] Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.077856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft"] Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.089590 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r"] Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.096400 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h"] Feb 18 06:02:53 crc kubenswrapper[4707]: W0218 06:02:53.097225 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc38a034_90cc_4976_93dd_ae54d298b574.slice/crio-64c8c4763bd0c3c28ef7b5406071b072c0e956c38cf0093276cdf7d210afd0cd 
WatchSource:0}: Error finding container 64c8c4763bd0c3c28ef7b5406071b072c0e956c38cf0093276cdf7d210afd0cd: Status 404 returned error can't find the container with id 64c8c4763bd0c3c28ef7b5406071b072c0e956c38cf0093276cdf7d210afd0cd Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.102847 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82"] Feb 18 06:02:53 crc kubenswrapper[4707]: W0218 06:02:53.106631 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5442f037_ff83_40b8_9c3f_c73c227effde.slice/crio-ba6828b7001a535dd41f5d742dfe4d311b9947609ebf020bc0c622d9280e2580 WatchSource:0}: Error finding container ba6828b7001a535dd41f5d742dfe4d311b9947609ebf020bc0c622d9280e2580: Status 404 returned error can't find the container with id ba6828b7001a535dd41f5d742dfe4d311b9947609ebf020bc0c622d9280e2580 Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.108781 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m"] Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.117395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.117640 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.117701 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert 
podName:8ed2f5cf-84b8-4a09-b76f-a60bcb055a04 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:55.117687122 +0000 UTC m=+911.765646256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert") pod "infra-operator-controller-manager-79d975b745-8gngt" (UID: "8ed2f5cf-84b8-4a09-b76f-a60bcb055a04") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.269250 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j"] Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.295048 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj"] Feb 18 06:02:53 crc kubenswrapper[4707]: W0218 06:02:53.299645 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod890576c4_79c6_40dc_b786_0fb2055a1a3e.slice/crio-335ba5daf26ea07d3c0e261644ee771016a3ce8df9bf8554b7ef9f9d6f25406e WatchSource:0}: Error finding container 335ba5daf26ea07d3c0e261644ee771016a3ce8df9bf8554b7ef9f9d6f25406e: Status 404 returned error can't find the container with id 335ba5daf26ea07d3c0e261644ee771016a3ce8df9bf8554b7ef9f9d6f25406e Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.306031 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z"] Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.311319 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk"] Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.316349 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f"] Feb 18 
06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.316644 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nft9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-4d59f_openstack-operators(890576c4-79c6-40dc-b786-0fb2055a1a3e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.317816 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" podUID="890576c4-79c6-40dc-b786-0fb2055a1a3e" Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.324174 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j"] Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.347991 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hm7nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-tj2tj_openstack-operators(78a96912-db1a-42b8-80aa-7800f28fb0c2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.349973 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" podUID="78a96912-db1a-42b8-80aa-7800f28fb0c2" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.354583 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jnbct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-ln7bk_openstack-operators(7f2692c0-280b-4449-ac2d-6a9da6eafebe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.355872 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" podUID="7f2692c0-280b-4449-ac2d-6a9da6eafebe" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.356571 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wsqfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-xbl8j_openstack-operators(77cee8d8-c1d5-4743-a6c0-478b7c16e991): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.357872 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" podUID="77cee8d8-c1d5-4743-a6c0-478b7c16e991" Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.423292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.423636 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.423739 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert podName:8078f629-a80e-4f59-b84a-33144cc5b0c6 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:55.423722268 +0000 UTC m=+912.071681402 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" (UID: "8078f629-a80e-4f59-b84a-33144cc5b0c6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.446564 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg"] Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.451615 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2dp4d"] Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.475204 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hplbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-4qmcg_openstack-operators(97e7c996-241f-4732-9e68-a371d114f664): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.476729 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" podUID="97e7c996-241f-4732-9e68-a371d114f664" Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.830569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.830643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.830788 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.830847 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:55.830832918 +0000 UTC m=+912.478792052 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "webhook-server-cert" not found Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.831546 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.831628 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:55.831610019 +0000 UTC m=+912.479569153 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "metrics-server-cert" not found Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.855376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" event={"ID":"67bdd3cc-ee7d-4e79-8568-75502788aa1d","Type":"ContainerStarted","Data":"cf6ea418f6cd541865d88969c62b0be1d465d86c2cd7eb2f1272330966acbb01"} Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.859204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" event={"ID":"1332f158-2c06-4a3d-9ca9-2dc667c471ba","Type":"ContainerStarted","Data":"fd89468f5bd755f5825c127c55c8dce05c3ca4c981c68c62b0557583a73b44f6"} Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.862414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" event={"ID":"93d80e73-44d0-4db8-8a43-ee2cc8b7e399","Type":"ContainerStarted","Data":"21d3d8b91c74d022da5570cbe0ac18ea62a209e01fa0f9d74a3ac98e73f8dd0a"} Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.872637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" event={"ID":"5442f037-ff83-40b8-9c3f-c73c227effde","Type":"ContainerStarted","Data":"ba6828b7001a535dd41f5d742dfe4d311b9947609ebf020bc0c622d9280e2580"} Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.874887 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" event={"ID":"dc8762c9-27f5-476e-840f-815aa3736e85","Type":"ContainerStarted","Data":"2b3c8fa1f43864119646d5d15986a38f0ae9f7f490d922006ef10616f04bc4fc"} Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.877248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" event={"ID":"97e7c996-241f-4732-9e68-a371d114f664","Type":"ContainerStarted","Data":"925cb3efe973bc90fe47eabcd85aab73d3a64fd7efc8634fdabc2121a415901f"} Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.879743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" event={"ID":"dc38a034-90cc-4976-93dd-ae54d298b574","Type":"ContainerStarted","Data":"64c8c4763bd0c3c28ef7b5406071b072c0e956c38cf0093276cdf7d210afd0cd"} Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.881069 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" podUID="97e7c996-241f-4732-9e68-a371d114f664" Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.883170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" event={"ID":"77cee8d8-c1d5-4743-a6c0-478b7c16e991","Type":"ContainerStarted","Data":"c7bf96e06b18ad698a6133998508ee47161f807cf4b1ef27b7cf1f6b9bd10a4a"} Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.884712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" event={"ID":"d4364aee-09c0-49d9-8f50-60e48ecb7d08","Type":"ContainerStarted","Data":"74c57639090992dba2e38028d4e63dbb5ca644dee6fdb672c0d9f1568d514926"} Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.885001 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" podUID="77cee8d8-c1d5-4743-a6c0-478b7c16e991" Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.889901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" event={"ID":"398bbd80-3377-4b8e-b9cd-bdb3a76167ca","Type":"ContainerStarted","Data":"b4793df56b885d4ac8ca1f4b2d9a6adc5d587b3abadcc68e2fd44232f92a45ec"} Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.892907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" event={"ID":"78a96912-db1a-42b8-80aa-7800f28fb0c2","Type":"ContainerStarted","Data":"efcc2feda40b2b42980d54091dc2e270b39aa966952a29034ed6e271d8d64e26"} Feb 18 
06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.901164 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" podUID="78a96912-db1a-42b8-80aa-7800f28fb0c2" Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.903823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" event={"ID":"890576c4-79c6-40dc-b786-0fb2055a1a3e","Type":"ContainerStarted","Data":"335ba5daf26ea07d3c0e261644ee771016a3ce8df9bf8554b7ef9f9d6f25406e"} Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.905172 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" podUID="890576c4-79c6-40dc-b786-0fb2055a1a3e" Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.907958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" event={"ID":"4c759f5c-da54-44e9-8dec-5f2622419af9","Type":"ContainerStarted","Data":"084c7752a3ba4a5033e06a5fbbb37114667b5fc656d1042f0a413977c32cc023"} Feb 18 06:02:53 crc kubenswrapper[4707]: I0218 06:02:53.909659 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" event={"ID":"1a236879-9c6a-4604-b5bc-024b7dfd5161","Type":"ContainerStarted","Data":"ac069783c7262242b57f9896dedd4e7bb045b7983f5c0df24c48e7d0801e8782"} Feb 18 06:02:53 
crc kubenswrapper[4707]: I0218 06:02:53.910994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" event={"ID":"7f2692c0-280b-4449-ac2d-6a9da6eafebe","Type":"ContainerStarted","Data":"32e20d5a909c4b475a39837aae0cb3e1726803636046ef8399e16041cac459ae"} Feb 18 06:02:53 crc kubenswrapper[4707]: E0218 06:02:53.912843 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" podUID="7f2692c0-280b-4449-ac2d-6a9da6eafebe" Feb 18 06:02:54 crc kubenswrapper[4707]: E0218 06:02:54.958612 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" podUID="7f2692c0-280b-4449-ac2d-6a9da6eafebe" Feb 18 06:02:54 crc kubenswrapper[4707]: E0218 06:02:54.958585 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" podUID="77cee8d8-c1d5-4743-a6c0-478b7c16e991" Feb 18 06:02:54 crc kubenswrapper[4707]: E0218 06:02:54.959018 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" podUID="78a96912-db1a-42b8-80aa-7800f28fb0c2" Feb 18 06:02:54 crc kubenswrapper[4707]: E0218 06:02:54.959077 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" podUID="890576c4-79c6-40dc-b786-0fb2055a1a3e" Feb 18 06:02:54 crc kubenswrapper[4707]: E0218 06:02:54.959206 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" podUID="97e7c996-241f-4732-9e68-a371d114f664" Feb 18 06:02:55 crc kubenswrapper[4707]: I0218 06:02:55.165647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:55 crc kubenswrapper[4707]: E0218 06:02:55.165774 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:55 crc kubenswrapper[4707]: E0218 06:02:55.165859 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert podName:8ed2f5cf-84b8-4a09-b76f-a60bcb055a04 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:59.165839993 +0000 UTC m=+915.813799127 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert") pod "infra-operator-controller-manager-79d975b745-8gngt" (UID: "8ed2f5cf-84b8-4a09-b76f-a60bcb055a04") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:55 crc kubenswrapper[4707]: I0218 06:02:55.472169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:55 crc kubenswrapper[4707]: E0218 06:02:55.472341 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:55 crc kubenswrapper[4707]: E0218 06:02:55.472387 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert podName:8078f629-a80e-4f59-b84a-33144cc5b0c6 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:59.472371815 +0000 UTC m=+916.120330949 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" (UID: "8078f629-a80e-4f59-b84a-33144cc5b0c6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:55 crc kubenswrapper[4707]: I0218 06:02:55.877365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:55 crc kubenswrapper[4707]: I0218 06:02:55.877457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:55 crc kubenswrapper[4707]: E0218 06:02:55.877524 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:02:55 crc kubenswrapper[4707]: E0218 06:02:55.877603 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:59.877585244 +0000 UTC m=+916.525544378 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "metrics-server-cert" not found Feb 18 06:02:55 crc kubenswrapper[4707]: E0218 06:02:55.877655 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:02:55 crc kubenswrapper[4707]: E0218 06:02:55.877730 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:02:59.877712528 +0000 UTC m=+916.525671662 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "webhook-server-cert" not found Feb 18 06:02:59 crc kubenswrapper[4707]: I0218 06:02:59.242313 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:02:59 crc kubenswrapper[4707]: E0218 06:02:59.242484 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:59 crc kubenswrapper[4707]: E0218 06:02:59.242766 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert 
podName:8ed2f5cf-84b8-4a09-b76f-a60bcb055a04 nodeName:}" failed. No retries permitted until 2026-02-18 06:03:07.242749275 +0000 UTC m=+923.890708399 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert") pod "infra-operator-controller-manager-79d975b745-8gngt" (UID: "8ed2f5cf-84b8-4a09-b76f-a60bcb055a04") : secret "infra-operator-webhook-server-cert" not found Feb 18 06:02:59 crc kubenswrapper[4707]: I0218 06:02:59.546121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:02:59 crc kubenswrapper[4707]: E0218 06:02:59.546334 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:59 crc kubenswrapper[4707]: E0218 06:02:59.546414 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert podName:8078f629-a80e-4f59-b84a-33144cc5b0c6 nodeName:}" failed. No retries permitted until 2026-02-18 06:03:07.54639629 +0000 UTC m=+924.194355424 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" (UID: "8078f629-a80e-4f59-b84a-33144cc5b0c6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:02:59 crc kubenswrapper[4707]: I0218 06:02:59.950486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:59 crc kubenswrapper[4707]: I0218 06:02:59.950569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:02:59 crc kubenswrapper[4707]: E0218 06:02:59.950737 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:02:59 crc kubenswrapper[4707]: E0218 06:02:59.950776 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:02:59 crc kubenswrapper[4707]: E0218 06:02:59.950787 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:03:07.950771437 +0000 UTC m=+924.598730561 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "webhook-server-cert" not found Feb 18 06:02:59 crc kubenswrapper[4707]: E0218 06:02:59.950938 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:03:07.95091447 +0000 UTC m=+924.598873704 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "metrics-server-cert" not found Feb 18 06:03:06 crc kubenswrapper[4707]: E0218 06:03:06.837410 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642" Feb 18 06:03:06 crc kubenswrapper[4707]: E0218 06:03:06.838279 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qg7v6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6d8bf5c495-6lrh7_openstack-operators(bc6f5234-aab6-43ea-89e1-a3f785742a89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:03:06 crc kubenswrapper[4707]: E0218 06:03:06.839479 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" podUID="bc6f5234-aab6-43ea-89e1-a3f785742a89" Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.075384 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" podUID="bc6f5234-aab6-43ea-89e1-a3f785742a89" Feb 18 06:03:07 crc kubenswrapper[4707]: I0218 06:03:07.251543 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:03:07 crc kubenswrapper[4707]: I0218 06:03:07.258061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ed2f5cf-84b8-4a09-b76f-a60bcb055a04-cert\") pod \"infra-operator-controller-manager-79d975b745-8gngt\" (UID: \"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:03:07 crc kubenswrapper[4707]: I0218 06:03:07.268990 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.399698 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd" Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.399937 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jbfcw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-48k7j_openstack-operators(398bbd80-3377-4b8e-b9cd-bdb3a76167ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.401128 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" podUID="398bbd80-3377-4b8e-b9cd-bdb3a76167ca" Feb 18 06:03:07 crc kubenswrapper[4707]: I0218 06:03:07.556319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.556524 4707 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.556619 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert podName:8078f629-a80e-4f59-b84a-33144cc5b0c6 nodeName:}" failed. No retries permitted until 2026-02-18 06:03:23.556599885 +0000 UTC m=+940.204559019 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" (UID: "8078f629-a80e-4f59-b84a-33144cc5b0c6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 06:03:07 crc kubenswrapper[4707]: I0218 06:03:07.960991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:03:07 crc kubenswrapper[4707]: I0218 06:03:07.961069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.961164 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.961266 4707 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:03:23.961220259 +0000 UTC m=+940.609179383 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "metrics-server-cert" not found Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.962150 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 06:03:07 crc kubenswrapper[4707]: E0218 06:03:07.962355 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs podName:c8f9f5f4-3cdb-4b04-bc52-26acb4dda227 nodeName:}" failed. No retries permitted until 2026-02-18 06:03:23.962211785 +0000 UTC m=+940.610170999 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs") pod "openstack-operator-controller-manager-dccc9b448-chjxf" (UID: "c8f9f5f4-3cdb-4b04-bc52-26acb4dda227") : secret "webhook-server-cert" not found Feb 18 06:03:08 crc kubenswrapper[4707]: E0218 06:03:08.079652 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" podUID="398bbd80-3377-4b8e-b9cd-bdb3a76167ca" Feb 18 06:03:08 crc kubenswrapper[4707]: E0218 06:03:08.087730 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 18 06:03:08 crc kubenswrapper[4707]: E0218 06:03:08.087919 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} 
{} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7vxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-2hv82_openstack-operators(5442f037-ff83-40b8-9c3f-c73c227effde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:03:08 crc kubenswrapper[4707]: E0218 06:03:08.089439 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" podUID="5442f037-ff83-40b8-9c3f-c73c227effde" Feb 18 06:03:08 crc kubenswrapper[4707]: E0218 06:03:08.804059 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 18 06:03:08 crc kubenswrapper[4707]: E0218 06:03:08.804221 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nkrln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-twhwz_openstack-operators(dc38a034-90cc-4976-93dd-ae54d298b574): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:03:08 crc kubenswrapper[4707]: E0218 06:03:08.805507 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" podUID="dc38a034-90cc-4976-93dd-ae54d298b574" Feb 18 06:03:09 crc kubenswrapper[4707]: E0218 06:03:09.086014 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" podUID="5442f037-ff83-40b8-9c3f-c73c227effde" Feb 18 06:03:09 crc kubenswrapper[4707]: E0218 06:03:09.092565 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" podUID="dc38a034-90cc-4976-93dd-ae54d298b574" Feb 18 06:03:09 crc kubenswrapper[4707]: E0218 06:03:09.279760 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 18 06:03:09 crc kubenswrapper[4707]: E0218 06:03:09.279959 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xq7xf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-6f96w_openstack-operators(274d7d14-4ef9-47b8-8a2e-07e7a2bb9850): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:03:09 crc kubenswrapper[4707]: E0218 06:03:09.281091 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" podUID="274d7d14-4ef9-47b8-8a2e-07e7a2bb9850" Feb 18 06:03:09 crc kubenswrapper[4707]: I0218 06:03:09.454461 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-8gngt"] Feb 18 06:03:09 crc kubenswrapper[4707]: W0218 06:03:09.497671 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ed2f5cf_84b8_4a09_b76f_a60bcb055a04.slice/crio-8b4974ec10f6d5aef241c5d794021184c8d27fe1bdc2ed619afc4da575f459e3 WatchSource:0}: Error finding container 8b4974ec10f6d5aef241c5d794021184c8d27fe1bdc2ed619afc4da575f459e3: Status 404 returned error can't find the container with id 8b4974ec10f6d5aef241c5d794021184c8d27fe1bdc2ed619afc4da575f459e3 Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.108985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" event={"ID":"4fcd0bf8-cf6a-45c0-862b-5554daa34c21","Type":"ContainerStarted","Data":"b5172d65901cc6a15f6bf20dea4202cff3f826c5ba59168b578de1b22cf68486"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.109656 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.127314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" event={"ID":"d4364aee-09c0-49d9-8f50-60e48ecb7d08","Type":"ContainerStarted","Data":"2eb7f524f765b6be9ad516c7c52b23bc3a0c17f0812432ac3d068d93ddc5ee7f"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.127453 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" Feb 18 06:03:10 crc 
kubenswrapper[4707]: I0218 06:03:10.140359 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" podStartSLOduration=2.604261977 podStartE2EDuration="19.140339955s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:52.72352489 +0000 UTC m=+909.371484024" lastFinishedPulling="2026-02-18 06:03:09.259602858 +0000 UTC m=+925.907562002" observedRunningTime="2026-02-18 06:03:10.136678688 +0000 UTC m=+926.784637812" watchObservedRunningTime="2026-02-18 06:03:10.140339955 +0000 UTC m=+926.788299089" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.159200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" event={"ID":"8f61ada5-7374-4801-89b2-c95aec2e52ab","Type":"ContainerStarted","Data":"65a7166d162a684500266ca434cb322fe6bdeda40d2d4b3ffeca15242e5721be"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.159909 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.161727 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" podStartSLOduration=3.375838259 podStartE2EDuration="19.161715507s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.474527052 +0000 UTC m=+910.122486186" lastFinishedPulling="2026-02-18 06:03:09.26040428 +0000 UTC m=+925.908363434" observedRunningTime="2026-02-18 06:03:10.160209677 +0000 UTC m=+926.808168811" watchObservedRunningTime="2026-02-18 06:03:10.161715507 +0000 UTC m=+926.809674641" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.169947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" event={"ID":"67bdd3cc-ee7d-4e79-8568-75502788aa1d","Type":"ContainerStarted","Data":"54ee25b81eba0f3a9481fce43795e7b815d8cc23c639e18206f5ea00dc726393"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.170551 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.172594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" event={"ID":"1332f158-2c06-4a3d-9ca9-2dc667c471ba","Type":"ContainerStarted","Data":"cb62d881a41152e90bf67578a94e15b4ca28b953d1c8c951e6e72006ec4ae40b"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.173378 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.175541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" event={"ID":"dc8762c9-27f5-476e-840f-815aa3736e85","Type":"ContainerStarted","Data":"f9c03256e14581e209d90e54b0fad6b0b45ce4e875c349c6b81bd13f5fbc1fc2"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.176019 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.189655 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" podStartSLOduration=2.247168963 podStartE2EDuration="19.189635813s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:52.312402652 +0000 UTC m=+908.960361786" lastFinishedPulling="2026-02-18 
06:03:09.254869502 +0000 UTC m=+925.902828636" observedRunningTime="2026-02-18 06:03:10.188188834 +0000 UTC m=+926.836147968" watchObservedRunningTime="2026-02-18 06:03:10.189635813 +0000 UTC m=+926.837594937" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.206444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" event={"ID":"93d80e73-44d0-4db8-8a43-ee2cc8b7e399","Type":"ContainerStarted","Data":"6b27bc008bd18e5945f6bff371dcfedb0da7fe04914f05f67f8a06e029c2d2d8"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.211577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" event={"ID":"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04","Type":"ContainerStarted","Data":"8b4974ec10f6d5aef241c5d794021184c8d27fe1bdc2ed619afc4da575f459e3"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.215258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" event={"ID":"6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742","Type":"ContainerStarted","Data":"88387505ae0aab1f9dd35a04178c38de1427ec4443daec079fc873ed7599d5e5"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.215392 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.219974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" event={"ID":"4c759f5c-da54-44e9-8dec-5f2622419af9","Type":"ContainerStarted","Data":"1770a2cbf5797c24f3fcab3aed0e5fa86bd6773e860b8418b4feb015c4b39eb2"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.220425 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" podStartSLOduration=3.263998068 podStartE2EDuration="19.220405085s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.29940017 +0000 UTC m=+909.947359304" lastFinishedPulling="2026-02-18 06:03:09.255807167 +0000 UTC m=+925.903766321" observedRunningTime="2026-02-18 06:03:10.214911018 +0000 UTC m=+926.862870162" watchObservedRunningTime="2026-02-18 06:03:10.220405085 +0000 UTC m=+926.868364219" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.220579 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" Feb 18 06:03:10 crc kubenswrapper[4707]: E0218 06:03:10.230634 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" podUID="274d7d14-4ef9-47b8-8a2e-07e7a2bb9850" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.233840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" event={"ID":"1a236879-9c6a-4604-b5bc-024b7dfd5161","Type":"ContainerStarted","Data":"9d63aa86d0d60d342f059e4f63752727e9d185e908c31a87538ab939c1b8f251"} Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.233924 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.254345 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" 
podStartSLOduration=3.098409239 podStartE2EDuration="19.254328542s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.098959619 +0000 UTC m=+909.746918753" lastFinishedPulling="2026-02-18 06:03:09.254878922 +0000 UTC m=+925.902838056" observedRunningTime="2026-02-18 06:03:10.251722612 +0000 UTC m=+926.899681746" watchObservedRunningTime="2026-02-18 06:03:10.254328542 +0000 UTC m=+926.902287676" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.286300 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" podStartSLOduration=3.146984941 podStartE2EDuration="19.286280486s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.118931596 +0000 UTC m=+909.766890720" lastFinishedPulling="2026-02-18 06:03:09.258227131 +0000 UTC m=+925.906186265" observedRunningTime="2026-02-18 06:03:10.280033118 +0000 UTC m=+926.927992252" watchObservedRunningTime="2026-02-18 06:03:10.286280486 +0000 UTC m=+926.934239620" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.305351 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" podStartSLOduration=2.926083616 podStartE2EDuration="19.305337254s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:52.87547011 +0000 UTC m=+909.523429244" lastFinishedPulling="2026-02-18 06:03:09.254723758 +0000 UTC m=+925.902682882" observedRunningTime="2026-02-18 06:03:10.301161033 +0000 UTC m=+926.949120167" watchObservedRunningTime="2026-02-18 06:03:10.305337254 +0000 UTC m=+926.953296388" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.320567 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" 
podStartSLOduration=3.184554862 podStartE2EDuration="19.320552771s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.118905234 +0000 UTC m=+909.766864368" lastFinishedPulling="2026-02-18 06:03:09.254903143 +0000 UTC m=+925.902862277" observedRunningTime="2026-02-18 06:03:10.319664827 +0000 UTC m=+926.967623961" watchObservedRunningTime="2026-02-18 06:03:10.320552771 +0000 UTC m=+926.968511905" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.373647 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" podStartSLOduration=2.7147609150000003 podStartE2EDuration="19.3736256s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:52.595865203 +0000 UTC m=+909.243824327" lastFinishedPulling="2026-02-18 06:03:09.254729868 +0000 UTC m=+925.902689012" observedRunningTime="2026-02-18 06:03:10.372252323 +0000 UTC m=+927.020211477" watchObservedRunningTime="2026-02-18 06:03:10.3736256 +0000 UTC m=+927.021584734" Feb 18 06:03:10 crc kubenswrapper[4707]: I0218 06:03:10.374223 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" podStartSLOduration=3.233265724 podStartE2EDuration="19.374217775s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.113987023 +0000 UTC m=+909.761946157" lastFinishedPulling="2026-02-18 06:03:09.254939064 +0000 UTC m=+925.902898208" observedRunningTime="2026-02-18 06:03:10.347006768 +0000 UTC m=+926.994965902" watchObservedRunningTime="2026-02-18 06:03:10.374217775 +0000 UTC m=+927.022176909" Feb 18 06:03:11 crc kubenswrapper[4707]: I0218 06:03:11.244255 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" Feb 18 06:03:17 crc 
kubenswrapper[4707]: I0218 06:03:17.282273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" event={"ID":"77cee8d8-c1d5-4743-a6c0-478b7c16e991","Type":"ContainerStarted","Data":"a5e217b63609536acbb12efc83efee9dfac1bc2d5994350fc690c4e2a7d1bda4"} Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.283045 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.283543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" event={"ID":"8ed2f5cf-84b8-4a09-b76f-a60bcb055a04","Type":"ContainerStarted","Data":"e9464945fd475a91d65088a6baab81d11ef70fcce208a56c17f11fa787fb10bd"} Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.283608 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.284676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" event={"ID":"97e7c996-241f-4732-9e68-a371d114f664","Type":"ContainerStarted","Data":"856a37ddeb93fdf34dab88f26191ae85c4d5283ec3cbb8746bb5951523cf0885"} Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.285916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" event={"ID":"7f2692c0-280b-4449-ac2d-6a9da6eafebe","Type":"ContainerStarted","Data":"bbd53f323cc8f2a638246a4a434dc416bb00a8a532a533d080fb4aa3c78e776b"} Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.286123 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.287065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" event={"ID":"78a96912-db1a-42b8-80aa-7800f28fb0c2","Type":"ContainerStarted","Data":"cbdc8ffef5bbf7292e03df1a012715d5d25e1d9453a793bdecb7ca4efc8db42f"} Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.287310 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.288159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" event={"ID":"890576c4-79c6-40dc-b786-0fb2055a1a3e","Type":"ContainerStarted","Data":"0a2feaccc3d3f707353ece281d17372a76a03ed8fba6a401e0e30d19e959f695"} Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.288347 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.300507 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" podStartSLOduration=3.413716586 podStartE2EDuration="26.300491744s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.356335359 +0000 UTC m=+910.004294493" lastFinishedPulling="2026-02-18 06:03:16.243110477 +0000 UTC m=+932.891069651" observedRunningTime="2026-02-18 06:03:17.297509085 +0000 UTC m=+933.945468229" watchObservedRunningTime="2026-02-18 06:03:17.300491744 +0000 UTC m=+933.948450878" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.312635 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4qmcg" podStartSLOduration=2.590644181 podStartE2EDuration="25.312620189s" podCreationTimestamp="2026-02-18 06:02:52 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.475093808 +0000 UTC m=+910.123052942" lastFinishedPulling="2026-02-18 06:03:16.197069816 +0000 UTC m=+932.845028950" observedRunningTime="2026-02-18 06:03:17.309335231 +0000 UTC m=+933.957294395" watchObservedRunningTime="2026-02-18 06:03:17.312620189 +0000 UTC m=+933.960579323" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.329474 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" podStartSLOduration=3.452848969 podStartE2EDuration="26.329456159s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.316467479 +0000 UTC m=+909.964426613" lastFinishedPulling="2026-02-18 06:03:16.193074669 +0000 UTC m=+932.841033803" observedRunningTime="2026-02-18 06:03:17.324088465 +0000 UTC m=+933.972047599" watchObservedRunningTime="2026-02-18 06:03:17.329456159 +0000 UTC m=+933.977415293" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.343464 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" podStartSLOduration=3.498249694 podStartE2EDuration="26.343446842s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.347887492 +0000 UTC m=+909.995846626" lastFinishedPulling="2026-02-18 06:03:16.19308464 +0000 UTC m=+932.841043774" observedRunningTime="2026-02-18 06:03:17.339014234 +0000 UTC m=+933.986973378" watchObservedRunningTime="2026-02-18 06:03:17.343446842 +0000 UTC m=+933.991405976" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.359093 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" podStartSLOduration=3.455872032 podStartE2EDuration="26.35907088s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.354012936 +0000 UTC m=+910.001972070" lastFinishedPulling="2026-02-18 06:03:16.257211744 +0000 UTC m=+932.905170918" observedRunningTime="2026-02-18 06:03:17.354762425 +0000 UTC m=+934.002721559" watchObservedRunningTime="2026-02-18 06:03:17.35907088 +0000 UTC m=+934.007030004" Feb 18 06:03:17 crc kubenswrapper[4707]: I0218 06:03:17.375524 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" podStartSLOduration=19.609340399 podStartE2EDuration="26.375497349s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:03:09.506744263 +0000 UTC m=+926.154703397" lastFinishedPulling="2026-02-18 06:03:16.272901213 +0000 UTC m=+932.920860347" observedRunningTime="2026-02-18 06:03:17.371784129 +0000 UTC m=+934.019743263" watchObservedRunningTime="2026-02-18 06:03:17.375497349 +0000 UTC m=+934.023456483" Feb 18 06:03:20 crc kubenswrapper[4707]: I0218 06:03:20.307072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" event={"ID":"bc6f5234-aab6-43ea-89e1-a3f785742a89","Type":"ContainerStarted","Data":"ab23f8d2b1db60db12ae987f45fe073e78df698b380dcd22ccab9e24365f1ab5"} Feb 18 06:03:20 crc kubenswrapper[4707]: I0218 06:03:20.307686 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" Feb 18 06:03:20 crc kubenswrapper[4707]: I0218 06:03:20.324963 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" podStartSLOduration=2.204362637 
podStartE2EDuration="29.3249435s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:52.697955954 +0000 UTC m=+909.345915088" lastFinishedPulling="2026-02-18 06:03:19.818536817 +0000 UTC m=+936.466495951" observedRunningTime="2026-02-18 06:03:20.32082987 +0000 UTC m=+936.968789004" watchObservedRunningTime="2026-02-18 06:03:20.3249435 +0000 UTC m=+936.972902634" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.319647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" event={"ID":"5442f037-ff83-40b8-9c3f-c73c227effde","Type":"ContainerStarted","Data":"9b2bce2b4ae645d2f684fd277ed87293570c2ff58e6bca4ca0ad6e2d83ba5bf5"} Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.320194 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.382425 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.382524 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.382588 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.383520 4707 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c49c62c491f92ccee8fba7d855c10e4bbd43134f1f4f54ad1885c2004d7c90c"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.383664 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://9c49c62c491f92ccee8fba7d855c10e4bbd43134f1f4f54ad1885c2004d7c90c" gracePeriod=600 Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.467507 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-fnz67" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.501894 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" podStartSLOduration=3.166861459 podStartE2EDuration="30.501873093s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.109027579 +0000 UTC m=+909.756986713" lastFinishedPulling="2026-02-18 06:03:20.444039213 +0000 UTC m=+937.091998347" observedRunningTime="2026-02-18 06:03:21.34835376 +0000 UTC m=+937.996312894" watchObservedRunningTime="2026-02-18 06:03:21.501873093 +0000 UTC m=+938.149832237" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.559600 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-244nk" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.661494 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-85hj5" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.800142 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-qj27r" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.800389 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c5s9h" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.815895 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnhbn" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.885517 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-v6v5m" Feb 18 06:03:21 crc kubenswrapper[4707]: I0218 06:03:21.997523 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-j5dft" Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.016738 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4d59f" Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.136971 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-9bb9z" Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.259000 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-tj2tj" Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.301477 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-ln7bk" Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.337430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-2dp4d" Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.338945 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="9c49c62c491f92ccee8fba7d855c10e4bbd43134f1f4f54ad1885c2004d7c90c" exitCode=0 Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.339930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"9c49c62c491f92ccee8fba7d855c10e4bbd43134f1f4f54ad1885c2004d7c90c"} Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.339964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"27b00527a1a2dd19572cd4b34eeb62edfbb26a8ff621d8e0ad7b7a217cf69cd3"} Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.339983 4707 scope.go:117] "RemoveContainer" containerID="e32285faa40540ae5e3f1a855f8e56122b182016a9d9b345bafd252cfac4ced1" Feb 18 06:03:22 crc kubenswrapper[4707]: I0218 06:03:22.345715 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xbl8j" Feb 18 06:03:23 crc kubenswrapper[4707]: I0218 06:03:23.349268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" event={"ID":"398bbd80-3377-4b8e-b9cd-bdb3a76167ca","Type":"ContainerStarted","Data":"eddcc24c78eecd885b47f9fa46f1fea33f85127bf23e00bcfd903be52f7bcc6a"} Feb 18 06:03:23 crc 
kubenswrapper[4707]: I0218 06:03:23.350051 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" Feb 18 06:03:23 crc kubenswrapper[4707]: I0218 06:03:23.353466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" event={"ID":"274d7d14-4ef9-47b8-8a2e-07e7a2bb9850","Type":"ContainerStarted","Data":"6e52fe14d858479c3c1cb6f7246bd96ae3416fe3ea02677d5864043538feeab6"} Feb 18 06:03:23 crc kubenswrapper[4707]: I0218 06:03:23.353666 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" Feb 18 06:03:23 crc kubenswrapper[4707]: I0218 06:03:23.369775 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" podStartSLOduration=3.178400024 podStartE2EDuration="32.36975288s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.316468269 +0000 UTC m=+909.964427403" lastFinishedPulling="2026-02-18 06:03:22.507821125 +0000 UTC m=+939.155780259" observedRunningTime="2026-02-18 06:03:23.366423861 +0000 UTC m=+940.014383015" watchObservedRunningTime="2026-02-18 06:03:23.36975288 +0000 UTC m=+940.017712034" Feb 18 06:03:23 crc kubenswrapper[4707]: I0218 06:03:23.387599 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" podStartSLOduration=2.577561984 podStartE2EDuration="32.387571737s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:52.723572941 +0000 UTC m=+909.371532075" lastFinishedPulling="2026-02-18 06:03:22.533582694 +0000 UTC m=+939.181541828" observedRunningTime="2026-02-18 06:03:23.385771888 +0000 UTC m=+940.033731032" 
watchObservedRunningTime="2026-02-18 06:03:23.387571737 +0000 UTC m=+940.035530911" Feb 18 06:03:23 crc kubenswrapper[4707]: I0218 06:03:23.598991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:03:23 crc kubenswrapper[4707]: I0218 06:03:23.606236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8078f629-a80e-4f59-b84a-33144cc5b0c6-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd\" (UID: \"8078f629-a80e-4f59-b84a-33144cc5b0c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:03:23 crc kubenswrapper[4707]: I0218 06:03:23.712816 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:03:24 crc kubenswrapper[4707]: I0218 06:03:24.004664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:03:24 crc kubenswrapper[4707]: I0218 06:03:24.005028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:03:24 crc kubenswrapper[4707]: I0218 06:03:24.009243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-metrics-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:03:24 crc kubenswrapper[4707]: I0218 06:03:24.009766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8f9f5f4-3cdb-4b04-bc52-26acb4dda227-webhook-certs\") pod \"openstack-operator-controller-manager-dccc9b448-chjxf\" (UID: \"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227\") " pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:03:24 crc kubenswrapper[4707]: I0218 06:03:24.048212 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:03:24 crc kubenswrapper[4707]: I0218 06:03:24.243557 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd"] Feb 18 06:03:24 crc kubenswrapper[4707]: W0218 06:03:24.257376 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8078f629_a80e_4f59_b84a_33144cc5b0c6.slice/crio-a52d7e687601e820a81ccca816e14db8a9ce5b493753c6d53149785b86c86400 WatchSource:0}: Error finding container a52d7e687601e820a81ccca816e14db8a9ce5b493753c6d53149785b86c86400: Status 404 returned error can't find the container with id a52d7e687601e820a81ccca816e14db8a9ce5b493753c6d53149785b86c86400 Feb 18 06:03:24 crc kubenswrapper[4707]: I0218 06:03:24.365582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" event={"ID":"8078f629-a80e-4f59-b84a-33144cc5b0c6","Type":"ContainerStarted","Data":"a52d7e687601e820a81ccca816e14db8a9ce5b493753c6d53149785b86c86400"} Feb 18 06:03:24 crc kubenswrapper[4707]: I0218 06:03:24.410356 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf"] Feb 18 06:03:24 crc kubenswrapper[4707]: W0218 06:03:24.414140 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8f9f5f4_3cdb_4b04_bc52_26acb4dda227.slice/crio-d3da9e51c502143e49cb949afd1066a00250457747bc18f788406449b506514e WatchSource:0}: Error finding container d3da9e51c502143e49cb949afd1066a00250457747bc18f788406449b506514e: Status 404 returned error can't find the container with id d3da9e51c502143e49cb949afd1066a00250457747bc18f788406449b506514e Feb 18 06:03:25 crc kubenswrapper[4707]: I0218 
06:03:25.371941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" event={"ID":"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227","Type":"ContainerStarted","Data":"d3da9e51c502143e49cb949afd1066a00250457747bc18f788406449b506514e"} Feb 18 06:03:27 crc kubenswrapper[4707]: I0218 06:03:27.275878 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" Feb 18 06:03:29 crc kubenswrapper[4707]: I0218 06:03:29.402018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" event={"ID":"c8f9f5f4-3cdb-4b04-bc52-26acb4dda227","Type":"ContainerStarted","Data":"c15e5694c94d2f96731397f5394c486572374fdd33b0a7819240ec0656c50db8"} Feb 18 06:03:29 crc kubenswrapper[4707]: I0218 06:03:29.402870 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:03:29 crc kubenswrapper[4707]: I0218 06:03:29.434903 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" podStartSLOduration=38.434880906 podStartE2EDuration="38.434880906s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:03:29.432189914 +0000 UTC m=+946.080149068" watchObservedRunningTime="2026-02-18 06:03:29.434880906 +0000 UTC m=+946.082840050" Feb 18 06:03:30 crc kubenswrapper[4707]: I0218 06:03:30.409187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" 
event={"ID":"dc38a034-90cc-4976-93dd-ae54d298b574","Type":"ContainerStarted","Data":"0dba94f8e61208e70067491e1d7fb9570b68f38bb7d7bc67eeea751f4ab3cdc1"} Feb 18 06:03:30 crc kubenswrapper[4707]: I0218 06:03:30.410088 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" Feb 18 06:03:30 crc kubenswrapper[4707]: I0218 06:03:30.425521 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" podStartSLOduration=2.416234509 podStartE2EDuration="39.425500898s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:02:53.109363059 +0000 UTC m=+909.757322193" lastFinishedPulling="2026-02-18 06:03:30.118629448 +0000 UTC m=+946.766588582" observedRunningTime="2026-02-18 06:03:30.423505436 +0000 UTC m=+947.071464570" watchObservedRunningTime="2026-02-18 06:03:30.425500898 +0000 UTC m=+947.073460032" Feb 18 06:03:31 crc kubenswrapper[4707]: I0218 06:03:31.418025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" event={"ID":"8078f629-a80e-4f59-b84a-33144cc5b0c6","Type":"ContainerStarted","Data":"7bfa6f07a4d818f9810176b1cd1c186a052d4889c482506c6b4831fc87b2017d"} Feb 18 06:03:31 crc kubenswrapper[4707]: I0218 06:03:31.418559 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:03:31 crc kubenswrapper[4707]: I0218 06:03:31.459202 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" podStartSLOduration=33.611269259 podStartE2EDuration="40.459186334s" podCreationTimestamp="2026-02-18 06:02:51 +0000 UTC" firstStartedPulling="2026-02-18 06:03:24.261751268 
+0000 UTC m=+940.909710402" lastFinishedPulling="2026-02-18 06:03:31.109668343 +0000 UTC m=+947.757627477" observedRunningTime="2026-02-18 06:03:31.455225748 +0000 UTC m=+948.103184892" watchObservedRunningTime="2026-02-18 06:03:31.459186334 +0000 UTC m=+948.107145468" Feb 18 06:03:31 crc kubenswrapper[4707]: I0218 06:03:31.714213 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6f96w" Feb 18 06:03:31 crc kubenswrapper[4707]: I0218 06:03:31.888250 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6lrh7" Feb 18 06:03:31 crc kubenswrapper[4707]: I0218 06:03:31.981654 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-2hv82" Feb 18 06:03:32 crc kubenswrapper[4707]: I0218 06:03:32.221200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-48k7j" Feb 18 06:03:34 crc kubenswrapper[4707]: I0218 06:03:34.067293 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-dccc9b448-chjxf" Feb 18 06:03:41 crc kubenswrapper[4707]: I0218 06:03:41.942029 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-twhwz" Feb 18 06:03:43 crc kubenswrapper[4707]: I0218 06:03:43.719673 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.742422 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zn7r"] Feb 18 06:03:59 crc kubenswrapper[4707]: 
I0218 06:03:59.744985 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.756445 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.756685 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-c486g" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.777987 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zn7r"] Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.832029 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nxsfq"] Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.833089 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.842435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-nxsfq\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.842490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spdtf\" (UniqueName: \"kubernetes.io/projected/0c3c596e-bab9-4b34-abb6-1c7631d496d6-kube-api-access-spdtf\") pod \"dnsmasq-dns-78dd6ddcc-nxsfq\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.842544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-config\") pod \"dnsmasq-dns-78dd6ddcc-nxsfq\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.842585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-config\") pod \"dnsmasq-dns-675f4bcbfc-2zn7r\" (UID: \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.842621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8w9\" (UniqueName: \"kubernetes.io/projected/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-kube-api-access-cq8w9\") pod \"dnsmasq-dns-675f4bcbfc-2zn7r\" (UID: \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.842990 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nxsfq"] Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.851588 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.943940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-nxsfq\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.944220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spdtf\" (UniqueName: \"kubernetes.io/projected/0c3c596e-bab9-4b34-abb6-1c7631d496d6-kube-api-access-spdtf\") pod 
\"dnsmasq-dns-78dd6ddcc-nxsfq\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.944318 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-config\") pod \"dnsmasq-dns-78dd6ddcc-nxsfq\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.944402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-config\") pod \"dnsmasq-dns-675f4bcbfc-2zn7r\" (UID: \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.944494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8w9\" (UniqueName: \"kubernetes.io/projected/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-kube-api-access-cq8w9\") pod \"dnsmasq-dns-675f4bcbfc-2zn7r\" (UID: \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.945491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-config\") pod \"dnsmasq-dns-675f4bcbfc-2zn7r\" (UID: \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.945657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-nxsfq\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.946771 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-config\") pod \"dnsmasq-dns-78dd6ddcc-nxsfq\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.977682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spdtf\" (UniqueName: \"kubernetes.io/projected/0c3c596e-bab9-4b34-abb6-1c7631d496d6-kube-api-access-spdtf\") pod \"dnsmasq-dns-78dd6ddcc-nxsfq\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:03:59 crc kubenswrapper[4707]: I0218 06:03:59.977765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8w9\" (UniqueName: \"kubernetes.io/projected/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-kube-api-access-cq8w9\") pod \"dnsmasq-dns-675f4bcbfc-2zn7r\" (UID: \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:04:00 crc kubenswrapper[4707]: I0218 06:04:00.079542 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:04:00 crc kubenswrapper[4707]: I0218 06:04:00.162612 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:04:00 crc kubenswrapper[4707]: I0218 06:04:00.534390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zn7r"] Feb 18 06:04:00 crc kubenswrapper[4707]: I0218 06:04:00.626743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" event={"ID":"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc","Type":"ContainerStarted","Data":"d4e56ca7f802d8cd2e50efbc0bcc90a93c0c9d3ec33a6ff30d1b45f5bbefba3e"} Feb 18 06:04:00 crc kubenswrapper[4707]: I0218 06:04:00.771894 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nxsfq"] Feb 18 06:04:00 crc kubenswrapper[4707]: W0218 06:04:00.772884 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c3c596e_bab9_4b34_abb6_1c7631d496d6.slice/crio-b9282ceadc5e34837ce4134c5051a4d16c6bf5c8d471b3e75eeaf3f17f81a16f WatchSource:0}: Error finding container b9282ceadc5e34837ce4134c5051a4d16c6bf5c8d471b3e75eeaf3f17f81a16f: Status 404 returned error can't find the container with id b9282ceadc5e34837ce4134c5051a4d16c6bf5c8d471b3e75eeaf3f17f81a16f Feb 18 06:04:01 crc kubenswrapper[4707]: I0218 06:04:01.656297 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" event={"ID":"0c3c596e-bab9-4b34-abb6-1c7631d496d6","Type":"ContainerStarted","Data":"b9282ceadc5e34837ce4134c5051a4d16c6bf5c8d471b3e75eeaf3f17f81a16f"} Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.566166 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zn7r"] Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.588346 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nvpt7"] Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.589390 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.601735 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nvpt7"] Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.694803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvvtz\" (UniqueName: \"kubernetes.io/projected/cb7dd0a9-7e95-4705-bee7-074d6862f9de-kube-api-access-hvvtz\") pod \"dnsmasq-dns-666b6646f7-nvpt7\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.694873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nvpt7\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.694932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-config\") pod \"dnsmasq-dns-666b6646f7-nvpt7\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.799351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvvtz\" (UniqueName: \"kubernetes.io/projected/cb7dd0a9-7e95-4705-bee7-074d6862f9de-kube-api-access-hvvtz\") pod \"dnsmasq-dns-666b6646f7-nvpt7\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.799406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nvpt7\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.799468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-config\") pod \"dnsmasq-dns-666b6646f7-nvpt7\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.800320 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-config\") pod \"dnsmasq-dns-666b6646f7-nvpt7\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.801059 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nvpt7\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.843642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvvtz\" (UniqueName: \"kubernetes.io/projected/cb7dd0a9-7e95-4705-bee7-074d6862f9de-kube-api-access-hvvtz\") pod \"dnsmasq-dns-666b6646f7-nvpt7\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.895170 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nxsfq"] Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.913809 4707 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.922740 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q9hkt"] Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.930562 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q9hkt"] Feb 18 06:04:02 crc kubenswrapper[4707]: I0218 06:04:02.930856 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.110560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-q9hkt\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.110620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n79x2\" (UniqueName: \"kubernetes.io/projected/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-kube-api-access-n79x2\") pod \"dnsmasq-dns-57d769cc4f-q9hkt\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.110693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-config\") pod \"dnsmasq-dns-57d769cc4f-q9hkt\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.212649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-config\") pod \"dnsmasq-dns-57d769cc4f-q9hkt\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.212747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-q9hkt\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.212803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n79x2\" (UniqueName: \"kubernetes.io/projected/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-kube-api-access-n79x2\") pod \"dnsmasq-dns-57d769cc4f-q9hkt\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.214360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-q9hkt\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.214520 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-config\") pod \"dnsmasq-dns-57d769cc4f-q9hkt\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.245501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n79x2\" (UniqueName: \"kubernetes.io/projected/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-kube-api-access-n79x2\") pod 
\"dnsmasq-dns-57d769cc4f-q9hkt\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.257372 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.654438 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nvpt7"] Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.688526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" event={"ID":"cb7dd0a9-7e95-4705-bee7-074d6862f9de","Type":"ContainerStarted","Data":"5f9f501ab0ba4b0647e04ca2aad7f20514fb488a24b570e4d72dd026fc028bd3"} Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.742886 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.744685 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.748339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.748488 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.748541 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.748623 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.748695 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kc457" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.748764 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.748850 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.750791 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.839834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.839884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/298a4b48-6611-4cb4-8ccf-e9a00c23622b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.839906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.839929 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.839946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.839961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5zg6\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-kube-api-access-c5zg6\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.840001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.840017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-config-data\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.840037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.840068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/298a4b48-6611-4cb4-8ccf-e9a00c23622b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.840268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/298a4b48-6611-4cb4-8ccf-e9a00c23622b-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941868 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5zg6\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-kube-api-access-c5zg6\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941920 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-config-data\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/298a4b48-6611-4cb4-8ccf-e9a00c23622b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.941997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.942020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.942921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.942941 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.943265 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.943429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.943943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-config-data\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.944935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " 
pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.949593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/298a4b48-6611-4cb4-8ccf-e9a00c23622b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.949713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/298a4b48-6611-4cb4-8ccf-e9a00c23622b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.950125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.951449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:03 crc kubenswrapper[4707]: I0218 06:04:03.992599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5zg6\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-kube-api-access-c5zg6\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:03.999634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " pod="openstack/rabbitmq-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.077176 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.089904 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q9hkt"] Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.105863 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.107318 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.121458 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.121516 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.121768 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.122014 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-j86kp" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.122153 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.122312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.122431 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.156139 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.256865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.256983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.257057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.257130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.257207 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/536dddc2-0691-4171-98b1-1462ddf6b38a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.257338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.257454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.257562 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwx2c\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-kube-api-access-gwx2c\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.257640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/536dddc2-0691-4171-98b1-1462ddf6b38a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.257724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.258107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.359841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.359933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwx2c\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-kube-api-access-gwx2c\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/536dddc2-0691-4171-98b1-1462ddf6b38a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360788 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/536dddc2-0691-4171-98b1-1462ddf6b38a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.360847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.361531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.367197 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc 
kubenswrapper[4707]: I0218 06:04:04.370751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.373313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.373782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.374044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.373799 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.380325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/536dddc2-0691-4171-98b1-1462ddf6b38a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.393366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/536dddc2-0691-4171-98b1-1462ddf6b38a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.393871 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwx2c\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-kube-api-access-gwx2c\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.418691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.465398 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.730305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" event={"ID":"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c","Type":"ContainerStarted","Data":"b809f7911f969e7763dac5f2775feea5f6d182fbc10e62558b9bd572a2579469"} Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.757588 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:04:04 crc kubenswrapper[4707]: W0218 06:04:04.759165 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod298a4b48_6611_4cb4_8ccf_e9a00c23622b.slice/crio-6f0f234a7ab4ef25717bc7a190e41071305ad93bd80646661d1136fb5827d3d7 WatchSource:0}: Error finding container 6f0f234a7ab4ef25717bc7a190e41071305ad93bd80646661d1136fb5827d3d7: Status 404 returned error can't find the container with id 6f0f234a7ab4ef25717bc7a190e41071305ad93bd80646661d1136fb5827d3d7 Feb 18 06:04:04 crc kubenswrapper[4707]: I0218 06:04:04.979254 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.185967 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.187287 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.195537 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.195782 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.195892 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.195943 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-g4pw4" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.199926 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.205217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.291223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77pn\" (UniqueName: \"kubernetes.io/projected/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-kube-api-access-k77pn\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.291295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.291327 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.292121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.292157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.292354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-kolla-config\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.292423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-config-data-default\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.292508 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.394273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k77pn\" (UniqueName: \"kubernetes.io/projected/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-kube-api-access-k77pn\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.394861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.394894 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.394917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.394937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-operator-scripts\") pod \"openstack-galera-0\" 
(UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.394967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-kolla-config\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.394985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-config-data-default\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.395016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.401856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.402533 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 
06:04:05.402769 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.411939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.413301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-config-data-default\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.415326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.426425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-kolla-config\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.447091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k77pn\" (UniqueName: 
\"kubernetes.io/projected/7ee6297b-9af9-40fd-90e0-edcb0c08f6e8-kube-api-access-k77pn\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.448523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8\") " pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.527172 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.778632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"536dddc2-0691-4171-98b1-1462ddf6b38a","Type":"ContainerStarted","Data":"af06aeec8837a106294b84c5c2d4d59123a901928ae74a2dabda32ca4d8247e1"} Feb 18 06:04:05 crc kubenswrapper[4707]: I0218 06:04:05.783451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"298a4b48-6611-4cb4-8ccf-e9a00c23622b","Type":"ContainerStarted","Data":"6f0f234a7ab4ef25717bc7a190e41071305ad93bd80646661d1136fb5827d3d7"} Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.254352 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.545878 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.547223 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.547314 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.555378 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.555998 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.556154 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.556279 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-gkpzm" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.720019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/28c5a172-7c7d-407a-b727-0f982f82680c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.720277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c5a172-7c7d-407a-b727-0f982f82680c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.720311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/28c5a172-7c7d-407a-b727-0f982f82680c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:06 crc 
kubenswrapper[4707]: I0218 06:04:06.720333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28c5a172-7c7d-407a-b727-0f982f82680c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.720351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/28c5a172-7c7d-407a-b727-0f982f82680c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.720383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czt4f\" (UniqueName: \"kubernetes.io/projected/28c5a172-7c7d-407a-b727-0f982f82680c-kube-api-access-czt4f\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.720403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/28c5a172-7c7d-407a-b727-0f982f82680c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.720451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0" 
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.805182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8","Type":"ContainerStarted","Data":"32537a20b30a36be6dbcea10dfbac351236e9880d58e0c6050ec50323406d67c"}
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.821555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/28c5a172-7c7d-407a-b727-0f982f82680c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.821602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c5a172-7c7d-407a-b727-0f982f82680c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.821633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/28c5a172-7c7d-407a-b727-0f982f82680c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.821656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28c5a172-7c7d-407a-b727-0f982f82680c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.821677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/28c5a172-7c7d-407a-b727-0f982f82680c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.821711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czt4f\" (UniqueName: \"kubernetes.io/projected/28c5a172-7c7d-407a-b727-0f982f82680c-kube-api-access-czt4f\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.821732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/28c5a172-7c7d-407a-b727-0f982f82680c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.821780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.822116 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.823169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/28c5a172-7c7d-407a-b727-0f982f82680c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.823565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28c5a172-7c7d-407a-b727-0f982f82680c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.824004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/28c5a172-7c7d-407a-b727-0f982f82680c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.824078 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/28c5a172-7c7d-407a-b727-0f982f82680c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.832639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c5a172-7c7d-407a-b727-0f982f82680c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.846466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/28c5a172-7c7d-407a-b727-0f982f82680c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.858582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.866124 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czt4f\" (UniqueName: \"kubernetes.io/projected/28c5a172-7c7d-407a-b727-0f982f82680c-kube-api-access-czt4f\") pod \"openstack-cell1-galera-0\" (UID: \"28c5a172-7c7d-407a-b727-0f982f82680c\") " pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.892456 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.917381 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.918327 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.925683 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.925940 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.927956 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-j8mt8"
Feb 18 06:04:06 crc kubenswrapper[4707]: I0218 06:04:06.933208 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.024590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr929\" (UniqueName: \"kubernetes.io/projected/4a3a1b52-c364-480e-a60b-8bc313f3002d-kube-api-access-wr929\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.024680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a3a1b52-c364-480e-a60b-8bc313f3002d-config-data\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.024704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a3a1b52-c364-480e-a60b-8bc313f3002d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.024729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a3a1b52-c364-480e-a60b-8bc313f3002d-kolla-config\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.024752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a3a1b52-c364-480e-a60b-8bc313f3002d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.129143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr929\" (UniqueName: \"kubernetes.io/projected/4a3a1b52-c364-480e-a60b-8bc313f3002d-kube-api-access-wr929\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.129281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a3a1b52-c364-480e-a60b-8bc313f3002d-config-data\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.129303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a3a1b52-c364-480e-a60b-8bc313f3002d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.129328 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a3a1b52-c364-480e-a60b-8bc313f3002d-kolla-config\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.129344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a3a1b52-c364-480e-a60b-8bc313f3002d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.130359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a3a1b52-c364-480e-a60b-8bc313f3002d-config-data\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.130878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a3a1b52-c364-480e-a60b-8bc313f3002d-kolla-config\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.133649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a3a1b52-c364-480e-a60b-8bc313f3002d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.147340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a3a1b52-c364-480e-a60b-8bc313f3002d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.149665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr929\" (UniqueName: \"kubernetes.io/projected/4a3a1b52-c364-480e-a60b-8bc313f3002d-kube-api-access-wr929\") pod \"memcached-0\" (UID: \"4a3a1b52-c364-480e-a60b-8bc313f3002d\") " pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.268758 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.653427 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 18 06:04:07 crc kubenswrapper[4707]: W0218 06:04:07.669455 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28c5a172_7c7d_407a_b727_0f982f82680c.slice/crio-ce8a533760eab02b4a442656d1a3a19c3e7dff36f76174c796f5f3dfdc7db9cd WatchSource:0}: Error finding container ce8a533760eab02b4a442656d1a3a19c3e7dff36f76174c796f5f3dfdc7db9cd: Status 404 returned error can't find the container with id ce8a533760eab02b4a442656d1a3a19c3e7dff36f76174c796f5f3dfdc7db9cd
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.829184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"28c5a172-7c7d-407a-b727-0f982f82680c","Type":"ContainerStarted","Data":"ce8a533760eab02b4a442656d1a3a19c3e7dff36f76174c796f5f3dfdc7db9cd"}
Feb 18 06:04:07 crc kubenswrapper[4707]: I0218 06:04:07.936025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Feb 18 06:04:08 crc kubenswrapper[4707]: W0218 06:04:08.021485 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a3a1b52_c364_480e_a60b_8bc313f3002d.slice/crio-20810580a6ffcf9f918c11846557e1c96ee9529d887ec6b941b21a7473f34935 WatchSource:0}: Error finding container 20810580a6ffcf9f918c11846557e1c96ee9529d887ec6b941b21a7473f34935: Status 404 returned error can't find the container with id 20810580a6ffcf9f918c11846557e1c96ee9529d887ec6b941b21a7473f34935
Feb 18 06:04:08 crc kubenswrapper[4707]: I0218 06:04:08.872169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4a3a1b52-c364-480e-a60b-8bc313f3002d","Type":"ContainerStarted","Data":"20810580a6ffcf9f918c11846557e1c96ee9529d887ec6b941b21a7473f34935"}
Feb 18 06:04:09 crc kubenswrapper[4707]: I0218 06:04:09.382719 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:04:09 crc kubenswrapper[4707]: I0218 06:04:09.384454 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 06:04:09 crc kubenswrapper[4707]: I0218 06:04:09.387092 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-m9mcl"
Feb 18 06:04:09 crc kubenswrapper[4707]: I0218 06:04:09.396073 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:04:09 crc kubenswrapper[4707]: I0218 06:04:09.503104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r24j6\" (UniqueName: \"kubernetes.io/projected/5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b-kube-api-access-r24j6\") pod \"kube-state-metrics-0\" (UID: \"5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:04:09 crc kubenswrapper[4707]: I0218 06:04:09.604634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r24j6\" (UniqueName: \"kubernetes.io/projected/5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b-kube-api-access-r24j6\") pod \"kube-state-metrics-0\" (UID: \"5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:04:09 crc kubenswrapper[4707]: I0218 06:04:09.637138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r24j6\" (UniqueName: \"kubernetes.io/projected/5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b-kube-api-access-r24j6\") pod \"kube-state-metrics-0\" (UID: \"5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:04:09 crc kubenswrapper[4707]: I0218 06:04:09.741553 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 06:04:11 crc kubenswrapper[4707]: I0218 06:04:11.915955 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-297xq"]
Feb 18 06:04:11 crc kubenswrapper[4707]: I0218 06:04:11.917191 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-297xq"
Feb 18 06:04:11 crc kubenswrapper[4707]: I0218 06:04:11.924963 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7rx4b"
Feb 18 06:04:11 crc kubenswrapper[4707]: I0218 06:04:11.932604 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-297xq"]
Feb 18 06:04:11 crc kubenswrapper[4707]: I0218 06:04:11.938480 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-f95ql"]
Feb 18 06:04:11 crc kubenswrapper[4707]: I0218 06:04:11.925510 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Feb 18 06:04:11 crc kubenswrapper[4707]: I0218 06:04:11.925599 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Feb 18 06:04:11 crc kubenswrapper[4707]: I0218 06:04:11.940307 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.005014 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-f95ql"]
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-var-run\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgdn5\" (UniqueName: \"kubernetes.io/projected/45dd27c5-0315-416d-99cc-197009aa5a8f-kube-api-access-fgdn5\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea5baf83-32e6-41ec-b14a-d32b3f848be6-ovn-controller-tls-certs\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-var-lib\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-etc-ovs\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea5baf83-32e6-41ec-b14a-d32b3f848be6-var-log-ovn\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-var-log\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067486 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea5baf83-32e6-41ec-b14a-d32b3f848be6-var-run-ovn\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea5baf83-32e6-41ec-b14a-d32b3f848be6-combined-ca-bundle\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxmz5\" (UniqueName: \"kubernetes.io/projected/ea5baf83-32e6-41ec-b14a-d32b3f848be6-kube-api-access-vxmz5\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea5baf83-32e6-41ec-b14a-d32b3f848be6-scripts\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea5baf83-32e6-41ec-b14a-d32b3f848be6-var-run\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.067658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45dd27c5-0315-416d-99cc-197009aa5a8f-scripts\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45dd27c5-0315-416d-99cc-197009aa5a8f-scripts\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-var-run\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169541 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgdn5\" (UniqueName: \"kubernetes.io/projected/45dd27c5-0315-416d-99cc-197009aa5a8f-kube-api-access-fgdn5\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea5baf83-32e6-41ec-b14a-d32b3f848be6-ovn-controller-tls-certs\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-var-lib\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-etc-ovs\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea5baf83-32e6-41ec-b14a-d32b3f848be6-var-log-ovn\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-var-log\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea5baf83-32e6-41ec-b14a-d32b3f848be6-var-run-ovn\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea5baf83-32e6-41ec-b14a-d32b3f848be6-combined-ca-bundle\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxmz5\" (UniqueName: \"kubernetes.io/projected/ea5baf83-32e6-41ec-b14a-d32b3f848be6-kube-api-access-vxmz5\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea5baf83-32e6-41ec-b14a-d32b3f848be6-scripts\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.169825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea5baf83-32e6-41ec-b14a-d32b3f848be6-var-run\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.170396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea5baf83-32e6-41ec-b14a-d32b3f848be6-var-run\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.172164 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-var-run\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.172599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-var-lib\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.172986 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-var-log\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.173244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea5baf83-32e6-41ec-b14a-d32b3f848be6-var-run-ovn\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.173584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45dd27c5-0315-416d-99cc-197009aa5a8f-etc-ovs\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.173615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea5baf83-32e6-41ec-b14a-d32b3f848be6-var-log-ovn\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.176358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45dd27c5-0315-416d-99cc-197009aa5a8f-scripts\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.186678 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea5baf83-32e6-41ec-b14a-d32b3f848be6-scripts\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.192789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxmz5\" (UniqueName: \"kubernetes.io/projected/ea5baf83-32e6-41ec-b14a-d32b3f848be6-kube-api-access-vxmz5\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.193323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea5baf83-32e6-41ec-b14a-d32b3f848be6-combined-ca-bundle\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.193935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgdn5\" (UniqueName: \"kubernetes.io/projected/45dd27c5-0315-416d-99cc-197009aa5a8f-kube-api-access-fgdn5\") pod \"ovn-controller-ovs-f95ql\" (UID: \"45dd27c5-0315-416d-99cc-197009aa5a8f\") " pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.198630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea5baf83-32e6-41ec-b14a-d32b3f848be6-ovn-controller-tls-certs\") pod \"ovn-controller-297xq\" (UID: \"ea5baf83-32e6-41ec-b14a-d32b3f848be6\") " pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.310794 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-297xq"
Feb 18 06:04:12 crc kubenswrapper[4707]: I0218 06:04:12.342342 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-f95ql"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.164042 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.165509 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.170191 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.170391 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.170535 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.170643 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.170829 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kv47w"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.178313 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.297761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a6797f-d647-4727-ac53-df0b6d7495ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.298268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54kr4\" (UniqueName: \"kubernetes.io/projected/27a6797f-d647-4727-ac53-df0b6d7495ca-kube-api-access-54kr4\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.298309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27a6797f-d647-4727-ac53-df0b6d7495ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.298336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a6797f-d647-4727-ac53-df0b6d7495ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.298417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.298466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27a6797f-d647-4727-ac53-df0b6d7495ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.298497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a6797f-d647-4727-ac53-df0b6d7495ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.298538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a6797f-d647-4727-ac53-df0b6d7495ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.400244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27a6797f-d647-4727-ac53-df0b6d7495ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.400345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a6797f-d647-4727-ac53-df0b6d7495ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.400375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.400441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27a6797f-d647-4727-ac53-df0b6d7495ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.400471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a6797f-d647-4727-ac53-df0b6d7495ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.400525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a6797f-d647-4727-ac53-df0b6d7495ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.400561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a6797f-d647-4727-ac53-df0b6d7495ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.400611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54kr4\" (UniqueName: \"kubernetes.io/projected/27a6797f-d647-4727-ac53-df0b6d7495ca-kube-api-access-54kr4\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.401619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27a6797f-d647-4727-ac53-df0b6d7495ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.406193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a6797f-d647-4727-ac53-df0b6d7495ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.406212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/27a6797f-d647-4727-ac53-df0b6d7495ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.406433 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.409758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a6797f-d647-4727-ac53-df0b6d7495ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.410644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a6797f-d647-4727-ac53-df0b6d7495ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.411484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27a6797f-d647-4727-ac53-df0b6d7495ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.427145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54kr4\" (UniqueName: \"kubernetes.io/projected/27a6797f-d647-4727-ac53-df0b6d7495ca-kube-api-access-54kr4\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " 
pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.433034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"27a6797f-d647-4727-ac53-df0b6d7495ca\") " pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:13 crc kubenswrapper[4707]: I0218 06:04:13.510042 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.170357 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.172667 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.175995 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.176778 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.177069 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.177241 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4t2ch" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.186564 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.272756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2df84df-113d-42d3-b7e7-ee6d01888dd9-config\") pod \"ovsdbserver-sb-0\" 
(UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.272875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c2df84df-113d-42d3-b7e7-ee6d01888dd9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.273018 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.273061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2df84df-113d-42d3-b7e7-ee6d01888dd9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.273104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2df84df-113d-42d3-b7e7-ee6d01888dd9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.273121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2df84df-113d-42d3-b7e7-ee6d01888dd9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 
06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.273260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vp8\" (UniqueName: \"kubernetes.io/projected/c2df84df-113d-42d3-b7e7-ee6d01888dd9-kube-api-access-h2vp8\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.273344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2df84df-113d-42d3-b7e7-ee6d01888dd9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.374963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2df84df-113d-42d3-b7e7-ee6d01888dd9-config\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.375079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c2df84df-113d-42d3-b7e7-ee6d01888dd9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.375121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.375143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2df84df-113d-42d3-b7e7-ee6d01888dd9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.375172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2df84df-113d-42d3-b7e7-ee6d01888dd9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.375192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2df84df-113d-42d3-b7e7-ee6d01888dd9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.375250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vp8\" (UniqueName: \"kubernetes.io/projected/c2df84df-113d-42d3-b7e7-ee6d01888dd9-kube-api-access-h2vp8\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.375305 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2df84df-113d-42d3-b7e7-ee6d01888dd9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.375581 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.376037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c2df84df-113d-42d3-b7e7-ee6d01888dd9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.376413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2df84df-113d-42d3-b7e7-ee6d01888dd9-config\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.376689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2df84df-113d-42d3-b7e7-ee6d01888dd9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.388206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2df84df-113d-42d3-b7e7-ee6d01888dd9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.394506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2df84df-113d-42d3-b7e7-ee6d01888dd9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.402168 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2df84df-113d-42d3-b7e7-ee6d01888dd9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.402511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vp8\" (UniqueName: \"kubernetes.io/projected/c2df84df-113d-42d3-b7e7-ee6d01888dd9-kube-api-access-h2vp8\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.429276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c2df84df-113d-42d3-b7e7-ee6d01888dd9\") " pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:16 crc kubenswrapper[4707]: I0218 06:04:16.504712 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:32 crc kubenswrapper[4707]: E0218 06:04:32.390071 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 18 06:04:32 crc kubenswrapper[4707]: E0218 06:04:32.390805 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gwx2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(536dddc2-0691-4171-98b1-1462ddf6b38a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:32 crc 
kubenswrapper[4707]: E0218 06:04:32.391984 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" Feb 18 06:04:33 crc kubenswrapper[4707]: E0218 06:04:33.132612 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" Feb 18 06:04:34 crc kubenswrapper[4707]: E0218 06:04:34.198674 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 18 06:04:34 crc kubenswrapper[4707]: E0218 06:04:34.199163 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-czt4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(28c5a172-7c7d-407a-b727-0f982f82680c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:34 crc kubenswrapper[4707]: E0218 06:04:34.200276 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="28c5a172-7c7d-407a-b727-0f982f82680c" Feb 18 06:04:34 crc kubenswrapper[4707]: E0218 06:04:34.225631 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 18 06:04:34 crc kubenswrapper[4707]: E0218 06:04:34.225859 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c5zg6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(298a4b48-6611-4cb4-8ccf-e9a00c23622b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:34 crc 
kubenswrapper[4707]: E0218 06:04:34.227316 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" Feb 18 06:04:34 crc kubenswrapper[4707]: E0218 06:04:34.258680 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 18 06:04:34 crc kubenswrapper[4707]: E0218 06:04:34.259084 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-
k77pn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(7ee6297b-9af9-40fd-90e0-edcb0c08f6e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:34 crc kubenswrapper[4707]: E0218 06:04:34.260852 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="7ee6297b-9af9-40fd-90e0-edcb0c08f6e8" Feb 18 06:04:34 crc kubenswrapper[4707]: I0218 06:04:34.756728 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.030610 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.030848 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c 
dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-spdtf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-78dd6ddcc-nxsfq_openstack(0c3c596e-bab9-4b34-abb6-1c7631d496d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.032782 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" podUID="0c3c596e-bab9-4b34-abb6-1c7631d496d6" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.039197 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.039292 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hvvtz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-nvpt7_openstack(cb7dd0a9-7e95-4705-bee7-074d6862f9de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.040830 4707 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" podUID="cb7dd0a9-7e95-4705-bee7-074d6862f9de" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.064098 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.064709 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cq8w9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-2zn7r_openstack(b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.065993 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" podUID="b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.066065 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.066163 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n79x2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPoli
cy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-q9hkt_openstack(e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.067400 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" podUID="e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.144554 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" podUID="cb7dd0a9-7e95-4705-bee7-074d6862f9de" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.144591 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" podUID="e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.144934 4707 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="7ee6297b-9af9-40fd-90e0-edcb0c08f6e8" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.146150 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="28c5a172-7c7d-407a-b727-0f982f82680c" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.147877 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.929994 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.930179 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n8h64dh599hd4h55ch596h7h699h544h5b9h578h5d8h5bh5ffh65h694h8bh54ch586h67fh587h547h5cdh5bh68fh89h5c7h58fh5c4hc6h698h645q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wr929,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(4a3a1b52-c364-480e-a60b-8bc313f3002d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:04:35 crc kubenswrapper[4707]: E0218 06:04:35.931905 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="4a3a1b52-c364-480e-a60b-8bc313f3002d" Feb 18 06:04:35 crc kubenswrapper[4707]: W0218 06:04:35.936596 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2df84df_113d_42d3_b7e7_ee6d01888dd9.slice/crio-c348cae5a90953fdc168b74245c6fe9feff07dd0016e3e5bf81c4674e776fdb5 WatchSource:0}: Error finding container c348cae5a90953fdc168b74245c6fe9feff07dd0016e3e5bf81c4674e776fdb5: Status 404 returned error can't find the container with id 
c348cae5a90953fdc168b74245c6fe9feff07dd0016e3e5bf81c4674e776fdb5 Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.036549 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.038015 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.134448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-dns-svc\") pod \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.134525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spdtf\" (UniqueName: \"kubernetes.io/projected/0c3c596e-bab9-4b34-abb6-1c7631d496d6-kube-api-access-spdtf\") pod \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.134572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-config\") pod \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\" (UID: \"0c3c596e-bab9-4b34-abb6-1c7631d496d6\") " Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.134599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-config\") pod \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\" (UID: \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\") " Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.134664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq8w9\" 
(UniqueName: \"kubernetes.io/projected/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-kube-api-access-cq8w9\") pod \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\" (UID: \"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc\") " Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.136717 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-config" (OuterVolumeSpecName: "config") pod "0c3c596e-bab9-4b34-abb6-1c7631d496d6" (UID: "0c3c596e-bab9-4b34-abb6-1c7631d496d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.136845 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-config" (OuterVolumeSpecName: "config") pod "b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc" (UID: "b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.136880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c3c596e-bab9-4b34-abb6-1c7631d496d6" (UID: "0c3c596e-bab9-4b34-abb6-1c7631d496d6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.143374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-kube-api-access-cq8w9" (OuterVolumeSpecName: "kube-api-access-cq8w9") pod "b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc" (UID: "b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc"). InnerVolumeSpecName "kube-api-access-cq8w9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.143460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3c596e-bab9-4b34-abb6-1c7631d496d6-kube-api-access-spdtf" (OuterVolumeSpecName: "kube-api-access-spdtf") pod "0c3c596e-bab9-4b34-abb6-1c7631d496d6" (UID: "0c3c596e-bab9-4b34-abb6-1c7631d496d6"). InnerVolumeSpecName "kube-api-access-spdtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.158578 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.159833 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" Feb 18 06:04:36 crc kubenswrapper[4707]: E0218 06:04:36.164974 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="4a3a1b52-c364-480e-a60b-8bc313f3002d" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.173541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2zn7r" event={"ID":"b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc","Type":"ContainerDied","Data":"d4e56ca7f802d8cd2e50efbc0bcc90a93c0c9d3ec33a6ff30d1b45f5bbefba3e"} Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.173609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-nxsfq" event={"ID":"0c3c596e-bab9-4b34-abb6-1c7631d496d6","Type":"ContainerDied","Data":"b9282ceadc5e34837ce4134c5051a4d16c6bf5c8d471b3e75eeaf3f17f81a16f"} Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.173625 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c2df84df-113d-42d3-b7e7-ee6d01888dd9","Type":"ContainerStarted","Data":"c348cae5a90953fdc168b74245c6fe9feff07dd0016e3e5bf81c4674e776fdb5"} Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.236043 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq8w9\" (UniqueName: \"kubernetes.io/projected/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-kube-api-access-cq8w9\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.236073 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.236086 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spdtf\" (UniqueName: \"kubernetes.io/projected/0c3c596e-bab9-4b34-abb6-1c7631d496d6-kube-api-access-spdtf\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.236096 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3c596e-bab9-4b34-abb6-1c7631d496d6-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.236106 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.308112 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nxsfq"] Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.382071 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nxsfq"] Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.455884 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-2zn7r"] Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.465255 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zn7r"] Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.555039 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.658566 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-297xq"] Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.668348 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-f95ql"] Feb 18 06:04:36 crc kubenswrapper[4707]: I0218 06:04:36.747665 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 06:04:36 crc kubenswrapper[4707]: W0218 06:04:36.753262 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5599a4a6_2d3b_4bab_a1f8_bc87c27f7b5b.slice/crio-e07c5e7a395476362faae959d8c903459c219e7c9cf988eb65a6d5df12447329 WatchSource:0}: Error finding container e07c5e7a395476362faae959d8c903459c219e7c9cf988eb65a6d5df12447329: Status 404 returned error can't find the container with id e07c5e7a395476362faae959d8c903459c219e7c9cf988eb65a6d5df12447329 Feb 18 06:04:37 crc kubenswrapper[4707]: I0218 06:04:37.171527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f95ql" event={"ID":"45dd27c5-0315-416d-99cc-197009aa5a8f","Type":"ContainerStarted","Data":"f65ab565842aca86f5dd179c118e30ed193acc09cc4a8933acfd7f910fd05d8c"} Feb 18 06:04:37 crc kubenswrapper[4707]: I0218 06:04:37.172782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"27a6797f-d647-4727-ac53-df0b6d7495ca","Type":"ContainerStarted","Data":"c03ed10b5a8ed33228f24b6da4339a91ff832af0be4ca74777d1490abfb5318d"} Feb 18 
06:04:37 crc kubenswrapper[4707]: I0218 06:04:37.173814 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b","Type":"ContainerStarted","Data":"e07c5e7a395476362faae959d8c903459c219e7c9cf988eb65a6d5df12447329"} Feb 18 06:04:37 crc kubenswrapper[4707]: I0218 06:04:37.175127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-297xq" event={"ID":"ea5baf83-32e6-41ec-b14a-d32b3f848be6","Type":"ContainerStarted","Data":"c9cc04078cbed141f76cec4109227633e2cf59740cdfcdf2794f3f0708de12c4"} Feb 18 06:04:38 crc kubenswrapper[4707]: I0218 06:04:38.064216 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3c596e-bab9-4b34-abb6-1c7631d496d6" path="/var/lib/kubelet/pods/0c3c596e-bab9-4b34-abb6-1c7631d496d6/volumes" Feb 18 06:04:38 crc kubenswrapper[4707]: I0218 06:04:38.064609 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc" path="/var/lib/kubelet/pods/b69cbcd8-f3c1-42f6-8c6d-b9244f5d48dc/volumes" Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.219884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b","Type":"ContainerStarted","Data":"df83f4762c58d675d988731d1991ae0c95a0096a3656759e571959a4314351b3"} Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.221294 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.224316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-297xq" event={"ID":"ea5baf83-32e6-41ec-b14a-d32b3f848be6","Type":"ContainerStarted","Data":"6ea5442afef402f46d149e03490bc51f68a1490a2a7ed5bbd2b49f4550cc04a0"} Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.224930 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-controller-297xq" Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.226635 4707 generic.go:334] "Generic (PLEG): container finished" podID="45dd27c5-0315-416d-99cc-197009aa5a8f" containerID="503022dd5daa1763ca261aa9ac473385e40ce64a8632bb4a12486d93beeec28c" exitCode=0 Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.226749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f95ql" event={"ID":"45dd27c5-0315-416d-99cc-197009aa5a8f","Type":"ContainerDied","Data":"503022dd5daa1763ca261aa9ac473385e40ce64a8632bb4a12486d93beeec28c"} Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.228857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c2df84df-113d-42d3-b7e7-ee6d01888dd9","Type":"ContainerStarted","Data":"0cfae76c78385704be6baf56c3511b0d3202cd0e54cb1244b28a6a0203f12853"} Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.238599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"27a6797f-d647-4727-ac53-df0b6d7495ca","Type":"ContainerStarted","Data":"1d9128b448020f7c5a190b7c3a73ead2bb066064019e52d19e0e0ee2e400a6e9"} Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.243481 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=28.572694838 podStartE2EDuration="32.243461195s" podCreationTimestamp="2026-02-18 06:04:09 +0000 UTC" firstStartedPulling="2026-02-18 06:04:36.755410397 +0000 UTC m=+1013.403369531" lastFinishedPulling="2026-02-18 06:04:40.426176754 +0000 UTC m=+1017.074135888" observedRunningTime="2026-02-18 06:04:41.238993706 +0000 UTC m=+1017.886952840" watchObservedRunningTime="2026-02-18 06:04:41.243461195 +0000 UTC m=+1017.891420329" Feb 18 06:04:41 crc kubenswrapper[4707]: I0218 06:04:41.260334 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-297xq" podStartSLOduration=27.274399691 podStartE2EDuration="30.260317998s" podCreationTimestamp="2026-02-18 06:04:11 +0000 UTC" firstStartedPulling="2026-02-18 06:04:36.664080687 +0000 UTC m=+1013.312039811" lastFinishedPulling="2026-02-18 06:04:39.649998964 +0000 UTC m=+1016.297958118" observedRunningTime="2026-02-18 06:04:41.256025232 +0000 UTC m=+1017.903984386" watchObservedRunningTime="2026-02-18 06:04:41.260317998 +0000 UTC m=+1017.908277132" Feb 18 06:04:42 crc kubenswrapper[4707]: I0218 06:04:42.253985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f95ql" event={"ID":"45dd27c5-0315-416d-99cc-197009aa5a8f","Type":"ContainerStarted","Data":"13aa5af51f7c3850f5b07fb99e3690d81401a3f3d47792736bcc4ae2de3af850"} Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.265205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"27a6797f-d647-4727-ac53-df0b6d7495ca","Type":"ContainerStarted","Data":"f9c58aeb8841d6c473f9b5e93a97f8c84de423e4a36fb39121278470ff238ad7"} Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.271967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f95ql" event={"ID":"45dd27c5-0315-416d-99cc-197009aa5a8f","Type":"ContainerStarted","Data":"a2dc8950b9f08789e3ede14e2a9c77c458b49584c6d6b8ad53d8a3b6bf3fea6b"} Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.272251 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-f95ql" Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.276900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c2df84df-113d-42d3-b7e7-ee6d01888dd9","Type":"ContainerStarted","Data":"835a81084a15808613e549965b02118bd626dfa14f8a405b18a3b628e9df1832"} Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.290703 4707 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=25.794900193 podStartE2EDuration="31.290674464s" podCreationTimestamp="2026-02-18 06:04:12 +0000 UTC" firstStartedPulling="2026-02-18 06:04:36.581929532 +0000 UTC m=+1013.229888666" lastFinishedPulling="2026-02-18 06:04:42.077703803 +0000 UTC m=+1018.725662937" observedRunningTime="2026-02-18 06:04:43.287228171 +0000 UTC m=+1019.935187325" watchObservedRunningTime="2026-02-18 06:04:43.290674464 +0000 UTC m=+1019.938633608" Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.320756 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-f95ql" podStartSLOduration=29.340616118 podStartE2EDuration="32.32072851s" podCreationTimestamp="2026-02-18 06:04:11 +0000 UTC" firstStartedPulling="2026-02-18 06:04:36.665928496 +0000 UTC m=+1013.313887630" lastFinishedPulling="2026-02-18 06:04:39.646040878 +0000 UTC m=+1016.294000022" observedRunningTime="2026-02-18 06:04:43.316131117 +0000 UTC m=+1019.964090291" watchObservedRunningTime="2026-02-18 06:04:43.32072851 +0000 UTC m=+1019.968687654" Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.344474 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=22.200175473 podStartE2EDuration="28.344451116s" podCreationTimestamp="2026-02-18 06:04:15 +0000 UTC" firstStartedPulling="2026-02-18 06:04:35.94462679 +0000 UTC m=+1012.592585924" lastFinishedPulling="2026-02-18 06:04:42.088902433 +0000 UTC m=+1018.736861567" observedRunningTime="2026-02-18 06:04:43.337411658 +0000 UTC m=+1019.985370792" watchObservedRunningTime="2026-02-18 06:04:43.344451116 +0000 UTC m=+1019.992410250" Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.505984 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.510764 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.510816 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.547342 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:43 crc kubenswrapper[4707]: I0218 06:04:43.551421 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:44 crc kubenswrapper[4707]: I0218 06:04:44.286374 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:44 crc kubenswrapper[4707]: I0218 06:04:44.286418 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-f95ql" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.333429 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.350035 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.639221 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q9hkt"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.700073 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-m2jp7"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.703729 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.710216 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.715879 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-f25c6"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.718158 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.720941 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-m2jp7"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.723253 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.729551 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-f25c6"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.762409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltqsb\" (UniqueName: \"kubernetes.io/projected/866d1055-f899-4a65-a353-366bf3a303bf-kube-api-access-ltqsb\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.762454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/866d1055-f899-4a65-a353-366bf3a303bf-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.762472 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866d1055-f899-4a65-a353-366bf3a303bf-combined-ca-bundle\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.762490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/866d1055-f899-4a65-a353-366bf3a303bf-ovn-rundir\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.762577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866d1055-f899-4a65-a353-366bf3a303bf-config\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.762596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/866d1055-f899-4a65-a353-366bf3a303bf-ovs-rundir\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.790879 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.792189 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.794822 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.794858 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.796108 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4458g" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.796252 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.797759 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.839996 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nvpt7"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kffsx\" (UniqueName: \"kubernetes.io/projected/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-kube-api-access-kffsx\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864363 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-config\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ssj75\" (UniqueName: \"kubernetes.io/projected/250e525d-abb2-4374-89d4-3b16602fc351-kube-api-access-ssj75\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/250e525d-abb2-4374-89d4-3b16602fc351-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/250e525d-abb2-4374-89d4-3b16602fc351-config\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866d1055-f899-4a65-a353-366bf3a303bf-config\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/866d1055-f899-4a65-a353-366bf3a303bf-ovs-rundir\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/250e525d-abb2-4374-89d4-3b16602fc351-scripts\") pod 
\"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864554 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/250e525d-abb2-4374-89d4-3b16602fc351-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltqsb\" (UniqueName: \"kubernetes.io/projected/866d1055-f899-4a65-a353-366bf3a303bf-kube-api-access-ltqsb\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/866d1055-f899-4a65-a353-366bf3a303bf-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-m2jp7\" 
(UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866d1055-f899-4a65-a353-366bf3a303bf-combined-ca-bundle\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/866d1055-f899-4a65-a353-366bf3a303bf-ovn-rundir\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e525d-abb2-4374-89d4-3b16602fc351-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.864737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/250e525d-abb2-4374-89d4-3b16602fc351-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.865550 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866d1055-f899-4a65-a353-366bf3a303bf-config\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " 
pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.865850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/866d1055-f899-4a65-a353-366bf3a303bf-ovs-rundir\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.871944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/866d1055-f899-4a65-a353-366bf3a303bf-ovn-rundir\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.872878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/866d1055-f899-4a65-a353-366bf3a303bf-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.878330 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-m9gbr"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.879746 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.882304 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.884554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866d1055-f899-4a65-a353-366bf3a303bf-combined-ca-bundle\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.899658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltqsb\" (UniqueName: \"kubernetes.io/projected/866d1055-f899-4a65-a353-366bf3a303bf-kube-api-access-ltqsb\") pod \"ovn-controller-metrics-m2jp7\" (UID: \"866d1055-f899-4a65-a353-366bf3a303bf\") " pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.934874 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-m9gbr"] Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/250e525d-abb2-4374-89d4-3b16602fc351-config\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/250e525d-abb2-4374-89d4-3b16602fc351-scripts\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966342 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966386 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966414 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/250e525d-abb2-4374-89d4-3b16602fc351-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e525d-abb2-4374-89d4-3b16602fc351-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/250e525d-abb2-4374-89d4-3b16602fc351-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-config\") pod 
\"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966522 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-dns-svc\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt4hx\" (UniqueName: \"kubernetes.io/projected/456e788f-7f0e-429e-8049-e023596ef19b-kube-api-access-gt4hx\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kffsx\" (UniqueName: 
\"kubernetes.io/projected/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-kube-api-access-kffsx\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-config\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssj75\" (UniqueName: \"kubernetes.io/projected/250e525d-abb2-4374-89d4-3b16602fc351-kube-api-access-ssj75\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.966740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/250e525d-abb2-4374-89d4-3b16602fc351-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.967348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/250e525d-abb2-4374-89d4-3b16602fc351-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.967852 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/250e525d-abb2-4374-89d4-3b16602fc351-config\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 
06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.970243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.970445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-config\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.970850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.970854 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/250e525d-abb2-4374-89d4-3b16602fc351-scripts\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.975784 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/250e525d-abb2-4374-89d4-3b16602fc351-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.975858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/250e525d-abb2-4374-89d4-3b16602fc351-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.976753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e525d-abb2-4374-89d4-3b16602fc351-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.988811 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kffsx\" (UniqueName: \"kubernetes.io/projected/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-kube-api-access-kffsx\") pod \"dnsmasq-dns-5bf47b49b7-f25c6\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:45 crc kubenswrapper[4707]: I0218 06:04:45.994578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssj75\" (UniqueName: \"kubernetes.io/projected/250e525d-abb2-4374-89d4-3b16602fc351-kube-api-access-ssj75\") pod \"ovn-northd-0\" (UID: \"250e525d-abb2-4374-89d4-3b16602fc351\") " pod="openstack/ovn-northd-0" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.035994 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-m2jp7" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.045265 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.068120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-config\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.068151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-dns-svc\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.068170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.068198 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt4hx\" (UniqueName: \"kubernetes.io/projected/456e788f-7f0e-429e-8049-e023596ef19b-kube-api-access-gt4hx\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.068214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" 
Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.069250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-dns-svc\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.069277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.069833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-config\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.069903 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.086141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt4hx\" (UniqueName: \"kubernetes.io/projected/456e788f-7f0e-429e-8049-e023596ef19b-kube-api-access-gt4hx\") pod \"dnsmasq-dns-8554648995-m9gbr\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.087650 4707 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.119448 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.169570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-dns-svc\") pod \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.169944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n79x2\" (UniqueName: \"kubernetes.io/projected/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-kube-api-access-n79x2\") pod \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.170096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c" (UID: "e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.170680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-config\") pod \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\" (UID: \"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c\") " Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.171045 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.172246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-config" (OuterVolumeSpecName: "config") pod "e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c" (UID: "e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.177994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-kube-api-access-n79x2" (OuterVolumeSpecName: "kube-api-access-n79x2") pod "e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c" (UID: "e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c"). InnerVolumeSpecName "kube-api-access-n79x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.207404 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.272471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvvtz\" (UniqueName: \"kubernetes.io/projected/cb7dd0a9-7e95-4705-bee7-074d6862f9de-kube-api-access-hvvtz\") pod \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.272621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-config\") pod \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.272666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-dns-svc\") pod \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\" (UID: \"cb7dd0a9-7e95-4705-bee7-074d6862f9de\") " Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.273068 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.273092 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n79x2\" (UniqueName: \"kubernetes.io/projected/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c-kube-api-access-n79x2\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.273891 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb7dd0a9-7e95-4705-bee7-074d6862f9de" (UID: "cb7dd0a9-7e95-4705-bee7-074d6862f9de"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.273908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-config" (OuterVolumeSpecName: "config") pod "cb7dd0a9-7e95-4705-bee7-074d6862f9de" (UID: "cb7dd0a9-7e95-4705-bee7-074d6862f9de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.275449 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb7dd0a9-7e95-4705-bee7-074d6862f9de-kube-api-access-hvvtz" (OuterVolumeSpecName: "kube-api-access-hvvtz") pod "cb7dd0a9-7e95-4705-bee7-074d6862f9de" (UID: "cb7dd0a9-7e95-4705-bee7-074d6862f9de"). InnerVolumeSpecName "kube-api-access-hvvtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.303939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" event={"ID":"cb7dd0a9-7e95-4705-bee7-074d6862f9de","Type":"ContainerDied","Data":"5f9f501ab0ba4b0647e04ca2aad7f20514fb488a24b570e4d72dd026fc028bd3"} Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.304001 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nvpt7" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.307749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"536dddc2-0691-4171-98b1-1462ddf6b38a","Type":"ContainerStarted","Data":"c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b"} Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.310257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" event={"ID":"e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c","Type":"ContainerDied","Data":"b809f7911f969e7763dac5f2775feea5f6d182fbc10e62558b9bd572a2579469"} Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.310389 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q9hkt" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.347368 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.370253 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q9hkt"] Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.374190 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.374218 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb7dd0a9-7e95-4705-bee7-074d6862f9de-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.374227 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvvtz\" (UniqueName: \"kubernetes.io/projected/cb7dd0a9-7e95-4705-bee7-074d6862f9de-kube-api-access-hvvtz\") on node 
\"crc\" DevicePath \"\"" Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.375942 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q9hkt"] Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.401648 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nvpt7"] Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.421877 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nvpt7"] Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.522200 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-m2jp7"] Feb 18 06:04:46 crc kubenswrapper[4707]: W0218 06:04:46.525681 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9eeba54_d0d6_4fe5_8154_58afdd82f8a2.slice/crio-e3ee461c1efee7c385c5c257f421ef434d0fcc407e5ce24ba13d903f3d00f8b9 WatchSource:0}: Error finding container e3ee461c1efee7c385c5c257f421ef434d0fcc407e5ce24ba13d903f3d00f8b9: Status 404 returned error can't find the container with id e3ee461c1efee7c385c5c257f421ef434d0fcc407e5ce24ba13d903f3d00f8b9 Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.532242 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-f25c6"] Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.634195 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 18 06:04:46 crc kubenswrapper[4707]: I0218 06:04:46.855007 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-m9gbr"] Feb 18 06:04:47 crc kubenswrapper[4707]: I0218 06:04:47.318888 4707 generic.go:334] "Generic (PLEG): container finished" podID="b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" containerID="f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba" exitCode=0 Feb 18 06:04:47 crc kubenswrapper[4707]: 
I0218 06:04:47.318971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" event={"ID":"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2","Type":"ContainerDied","Data":"f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba"} Feb 18 06:04:47 crc kubenswrapper[4707]: I0218 06:04:47.319161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" event={"ID":"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2","Type":"ContainerStarted","Data":"e3ee461c1efee7c385c5c257f421ef434d0fcc407e5ce24ba13d903f3d00f8b9"} Feb 18 06:04:47 crc kubenswrapper[4707]: I0218 06:04:47.323023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-m2jp7" event={"ID":"866d1055-f899-4a65-a353-366bf3a303bf","Type":"ContainerStarted","Data":"ca3bf75a1fa97fe4515b38bf3a2f61947e62d17ff9c58a559c557b5f8f6f501c"} Feb 18 06:04:47 crc kubenswrapper[4707]: I0218 06:04:47.323059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-m2jp7" event={"ID":"866d1055-f899-4a65-a353-366bf3a303bf","Type":"ContainerStarted","Data":"8ade83c0e94b55641257c6326bdc6dc2b248655cbb1d323d352079e3629c5fc7"} Feb 18 06:04:47 crc kubenswrapper[4707]: I0218 06:04:47.324554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"250e525d-abb2-4374-89d4-3b16602fc351","Type":"ContainerStarted","Data":"f5f4fa4d9bfe81a8a66a47737be8eb0e544fb845ea07199c59891c8486b72584"} Feb 18 06:04:47 crc kubenswrapper[4707]: I0218 06:04:47.325549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-m9gbr" event={"ID":"456e788f-7f0e-429e-8049-e023596ef19b","Type":"ContainerStarted","Data":"49d4b7cb85615b630b898934d1f253a6ca59d257f84f3139aca7655fb86de4c0"} Feb 18 06:04:47 crc kubenswrapper[4707]: I0218 06:04:47.365863 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-metrics-m2jp7" podStartSLOduration=2.365841742 podStartE2EDuration="2.365841742s" podCreationTimestamp="2026-02-18 06:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:04:47.363426537 +0000 UTC m=+1024.011385681" watchObservedRunningTime="2026-02-18 06:04:47.365841742 +0000 UTC m=+1024.013800876" Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.070054 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb7dd0a9-7e95-4705-bee7-074d6862f9de" path="/var/lib/kubelet/pods/cb7dd0a9-7e95-4705-bee7-074d6862f9de/volumes" Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.071254 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c" path="/var/lib/kubelet/pods/e0e0f0e7-ae4c-44ba-9eb9-6d0f7073662c/volumes" Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.334561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" event={"ID":"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2","Type":"ContainerStarted","Data":"95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546"} Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.334648 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.336211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"28c5a172-7c7d-407a-b727-0f982f82680c","Type":"ContainerStarted","Data":"69d653e8a28b69177271a26ab18dcd6c368a10d430d861d5451027642aa52efd"} Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.338037 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"250e525d-abb2-4374-89d4-3b16602fc351","Type":"ContainerStarted","Data":"1748577d9a6307ce3a14eba8e4fd3584935ad088762e8a205b14bcc21b362518"} Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.338080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"250e525d-abb2-4374-89d4-3b16602fc351","Type":"ContainerStarted","Data":"85f9f4090de1df9ca7602602d1a2ceaae53faeef8281f3da6042d2abb8f99c8b"} Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.338186 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.339507 4707 generic.go:334] "Generic (PLEG): container finished" podID="456e788f-7f0e-429e-8049-e023596ef19b" containerID="c1eb60ee387c8782ff8ef08dc4c1c48d2927007dd0875469200661e9ba754adb" exitCode=0 Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.339542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-m9gbr" event={"ID":"456e788f-7f0e-429e-8049-e023596ef19b","Type":"ContainerDied","Data":"c1eb60ee387c8782ff8ef08dc4c1c48d2927007dd0875469200661e9ba754adb"} Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.352202 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" podStartSLOduration=2.9886917950000003 podStartE2EDuration="3.3521869s" podCreationTimestamp="2026-02-18 06:04:45 +0000 UTC" firstStartedPulling="2026-02-18 06:04:46.53890101 +0000 UTC m=+1023.186860144" lastFinishedPulling="2026-02-18 06:04:46.902396115 +0000 UTC m=+1023.550355249" observedRunningTime="2026-02-18 06:04:48.35180986 +0000 UTC m=+1024.999769004" watchObservedRunningTime="2026-02-18 06:04:48.3521869 +0000 UTC m=+1025.000146034" Feb 18 06:04:48 crc kubenswrapper[4707]: I0218 06:04:48.432629 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" 
podStartSLOduration=2.5174144590000003 podStartE2EDuration="3.432609909s" podCreationTimestamp="2026-02-18 06:04:45 +0000 UTC" firstStartedPulling="2026-02-18 06:04:46.644892615 +0000 UTC m=+1023.292851749" lastFinishedPulling="2026-02-18 06:04:47.560088055 +0000 UTC m=+1024.208047199" observedRunningTime="2026-02-18 06:04:48.420693809 +0000 UTC m=+1025.068652943" watchObservedRunningTime="2026-02-18 06:04:48.432609909 +0000 UTC m=+1025.080569033" Feb 18 06:04:49 crc kubenswrapper[4707]: I0218 06:04:49.346748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8","Type":"ContainerStarted","Data":"4ad993acde71ea67a58a6ee279e9abc445bf07aaeb61845ca7ea17c84f0b15ed"} Feb 18 06:04:49 crc kubenswrapper[4707]: I0218 06:04:49.348826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-m9gbr" event={"ID":"456e788f-7f0e-429e-8049-e023596ef19b","Type":"ContainerStarted","Data":"e7b0396c9755caeb1b5fb02d292d016e76b0c80b2e0a7393ab1998d534ab833f"} Feb 18 06:04:49 crc kubenswrapper[4707]: I0218 06:04:49.348907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:49 crc kubenswrapper[4707]: I0218 06:04:49.350906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4a3a1b52-c364-480e-a60b-8bc313f3002d","Type":"ContainerStarted","Data":"0514b179bfd519cad5d0c3c04bec0d9ccb06c4c23c65f3b56ae9686e6122d177"} Feb 18 06:04:49 crc kubenswrapper[4707]: I0218 06:04:49.351262 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 18 06:04:49 crc kubenswrapper[4707]: I0218 06:04:49.385985 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.907636553 podStartE2EDuration="43.385963043s" podCreationTimestamp="2026-02-18 06:04:06 +0000 UTC" 
firstStartedPulling="2026-02-18 06:04:08.034886342 +0000 UTC m=+984.682845476" lastFinishedPulling="2026-02-18 06:04:48.513212832 +0000 UTC m=+1025.161171966" observedRunningTime="2026-02-18 06:04:49.385182181 +0000 UTC m=+1026.033141315" watchObservedRunningTime="2026-02-18 06:04:49.385963043 +0000 UTC m=+1026.033922177" Feb 18 06:04:49 crc kubenswrapper[4707]: I0218 06:04:49.407612 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-m9gbr" podStartSLOduration=3.777613917 podStartE2EDuration="4.407594413s" podCreationTimestamp="2026-02-18 06:04:45 +0000 UTC" firstStartedPulling="2026-02-18 06:04:46.858445126 +0000 UTC m=+1023.506404260" lastFinishedPulling="2026-02-18 06:04:47.488425632 +0000 UTC m=+1024.136384756" observedRunningTime="2026-02-18 06:04:49.405087695 +0000 UTC m=+1026.053046829" watchObservedRunningTime="2026-02-18 06:04:49.407594413 +0000 UTC m=+1026.055553537" Feb 18 06:04:49 crc kubenswrapper[4707]: I0218 06:04:49.748715 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 18 06:04:51 crc kubenswrapper[4707]: I0218 06:04:51.368954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"298a4b48-6611-4cb4-8ccf-e9a00c23622b","Type":"ContainerStarted","Data":"cd5038c856e6b32cf4df33653ad2d71f042ff62ccb85562b6962da27b37269dd"} Feb 18 06:04:51 crc kubenswrapper[4707]: I0218 06:04:51.371941 4707 generic.go:334] "Generic (PLEG): container finished" podID="28c5a172-7c7d-407a-b727-0f982f82680c" containerID="69d653e8a28b69177271a26ab18dcd6c368a10d430d861d5451027642aa52efd" exitCode=0 Feb 18 06:04:51 crc kubenswrapper[4707]: I0218 06:04:51.372020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"28c5a172-7c7d-407a-b727-0f982f82680c","Type":"ContainerDied","Data":"69d653e8a28b69177271a26ab18dcd6c368a10d430d861d5451027642aa52efd"} Feb 18 
06:04:52 crc kubenswrapper[4707]: I0218 06:04:52.381199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"28c5a172-7c7d-407a-b727-0f982f82680c","Type":"ContainerStarted","Data":"855f5e944c11483a87fcacd8840570fbd1e7ba1bc0ba8914a7942ac948bbb64c"} Feb 18 06:04:52 crc kubenswrapper[4707]: I0218 06:04:52.403495 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.517594367 podStartE2EDuration="47.403473828s" podCreationTimestamp="2026-02-18 06:04:05 +0000 UTC" firstStartedPulling="2026-02-18 06:04:07.674182263 +0000 UTC m=+984.322141397" lastFinishedPulling="2026-02-18 06:04:47.560061724 +0000 UTC m=+1024.208020858" observedRunningTime="2026-02-18 06:04:52.398848745 +0000 UTC m=+1029.046807889" watchObservedRunningTime="2026-02-18 06:04:52.403473828 +0000 UTC m=+1029.051432962" Feb 18 06:04:53 crc kubenswrapper[4707]: I0218 06:04:53.389165 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ee6297b-9af9-40fd-90e0-edcb0c08f6e8" containerID="4ad993acde71ea67a58a6ee279e9abc445bf07aaeb61845ca7ea17c84f0b15ed" exitCode=0 Feb 18 06:04:53 crc kubenswrapper[4707]: I0218 06:04:53.389255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8","Type":"ContainerDied","Data":"4ad993acde71ea67a58a6ee279e9abc445bf07aaeb61845ca7ea17c84f0b15ed"} Feb 18 06:04:54 crc kubenswrapper[4707]: I0218 06:04:54.396444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7ee6297b-9af9-40fd-90e0-edcb0c08f6e8","Type":"ContainerStarted","Data":"a55bedfae158055f61aaa637f3e4f43b562ec9d4fd15b0fe001bc6102a5adfe0"} Feb 18 06:04:54 crc kubenswrapper[4707]: I0218 06:04:54.432113 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371986.422688 
podStartE2EDuration="50.432087377s" podCreationTimestamp="2026-02-18 06:04:04 +0000 UTC" firstStartedPulling="2026-02-18 06:04:06.271502832 +0000 UTC m=+982.919461966" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:04:54.4232524 +0000 UTC m=+1031.071211534" watchObservedRunningTime="2026-02-18 06:04:54.432087377 +0000 UTC m=+1031.080046511" Feb 18 06:04:55 crc kubenswrapper[4707]: I0218 06:04:55.528984 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 18 06:04:55 crc kubenswrapper[4707]: I0218 06:04:55.529380 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 18 06:04:55 crc kubenswrapper[4707]: E0218 06:04:55.538878 4707 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.17:39388->38.102.83.17:43371: read tcp 38.102.83.17:39388->38.102.83.17:43371: read: connection reset by peer Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.047713 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.349951 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.403744 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-f25c6"] Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.409643 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" podUID="b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" containerName="dnsmasq-dns" containerID="cri-o://95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546" gracePeriod=10 Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.864312 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.893380 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.893815 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.963032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kffsx\" (UniqueName: \"kubernetes.io/projected/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-kube-api-access-kffsx\") pod \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.963099 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-ovsdbserver-nb\") pod \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.963216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-config\") pod \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.963331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-dns-svc\") pod \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\" (UID: \"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2\") " Feb 18 06:04:56 crc kubenswrapper[4707]: I0218 06:04:56.974115 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-kube-api-access-kffsx" (OuterVolumeSpecName: "kube-api-access-kffsx") pod "b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" (UID: "b9eeba54-d0d6-4fe5-8154-58afdd82f8a2"). InnerVolumeSpecName "kube-api-access-kffsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:56.999968 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.007359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" (UID: "b9eeba54-d0d6-4fe5-8154-58afdd82f8a2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.007771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" (UID: "b9eeba54-d0d6-4fe5-8154-58afdd82f8a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.022692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-config" (OuterVolumeSpecName: "config") pod "b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" (UID: "b9eeba54-d0d6-4fe5-8154-58afdd82f8a2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.066042 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kffsx\" (UniqueName: \"kubernetes.io/projected/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-kube-api-access-kffsx\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.066072 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.066083 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.066092 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.270710 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.418509 4707 generic.go:334] "Generic (PLEG): container finished" podID="b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" containerID="95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546" exitCode=0 Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.418587 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.418625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" event={"ID":"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2","Type":"ContainerDied","Data":"95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546"} Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.418995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-f25c6" event={"ID":"b9eeba54-d0d6-4fe5-8154-58afdd82f8a2","Type":"ContainerDied","Data":"e3ee461c1efee7c385c5c257f421ef434d0fcc407e5ce24ba13d903f3d00f8b9"} Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.419033 4707 scope.go:117] "RemoveContainer" containerID="95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.464941 4707 scope.go:117] "RemoveContainer" containerID="f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.465320 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-f25c6"] Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.467424 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-f25c6"] Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.502215 4707 scope.go:117] "RemoveContainer" containerID="95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546" Feb 18 06:04:57 crc kubenswrapper[4707]: E0218 06:04:57.502676 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546\": container with ID starting with 95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546 not found: ID does not exist" 
containerID="95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.502841 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546"} err="failed to get container status \"95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546\": rpc error: code = NotFound desc = could not find container \"95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546\": container with ID starting with 95d6f106eee1b9d09f9f69e98a94262f9c153975ebbcdc2c1939a7722436d546 not found: ID does not exist" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.502957 4707 scope.go:117] "RemoveContainer" containerID="f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba" Feb 18 06:04:57 crc kubenswrapper[4707]: E0218 06:04:57.503541 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba\": container with ID starting with f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba not found: ID does not exist" containerID="f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.503581 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba"} err="failed to get container status \"f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba\": rpc error: code = NotFound desc = could not find container \"f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba\": container with ID starting with f7ae1ce9e3f4dc315b04d2bb7567b2b5ce12677a9828789dc98598df58b94aba not found: ID does not exist" Feb 18 06:04:57 crc kubenswrapper[4707]: I0218 06:04:57.603332 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 18 06:04:58 crc kubenswrapper[4707]: I0218 06:04:58.065556 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" path="/var/lib/kubelet/pods/b9eeba54-d0d6-4fe5-8154-58afdd82f8a2/volumes" Feb 18 06:04:59 crc kubenswrapper[4707]: I0218 06:04:59.666124 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 18 06:04:59 crc kubenswrapper[4707]: I0218 06:04:59.901104 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bfwr6"] Feb 18 06:04:59 crc kubenswrapper[4707]: E0218 06:04:59.901633 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" containerName="dnsmasq-dns" Feb 18 06:04:59 crc kubenswrapper[4707]: I0218 06:04:59.901655 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" containerName="dnsmasq-dns" Feb 18 06:04:59 crc kubenswrapper[4707]: E0218 06:04:59.901717 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" containerName="init" Feb 18 06:04:59 crc kubenswrapper[4707]: I0218 06:04:59.901725 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" containerName="init" Feb 18 06:04:59 crc kubenswrapper[4707]: I0218 06:04:59.901925 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eeba54-d0d6-4fe5-8154-58afdd82f8a2" containerName="dnsmasq-dns" Feb 18 06:04:59 crc kubenswrapper[4707]: I0218 06:04:59.903051 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:04:59 crc kubenswrapper[4707]: I0218 06:04:59.909846 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bfwr6"] Feb 18 06:04:59 crc kubenswrapper[4707]: I0218 06:04:59.968150 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.017254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-config\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.017370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.017503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlrb\" (UniqueName: \"kubernetes.io/projected/d14db420-fcd2-4cc6-b14e-0b75560e3207-kube-api-access-qrlrb\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.017602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.017654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.119080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlrb\" (UniqueName: \"kubernetes.io/projected/d14db420-fcd2-4cc6-b14e-0b75560e3207-kube-api-access-qrlrb\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.119140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.119180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.119219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-config\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 
crc kubenswrapper[4707]: I0218 06:05:00.119237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.120693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.120863 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.120862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.120732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-config\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.155543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qrlrb\" (UniqueName: \"kubernetes.io/projected/d14db420-fcd2-4cc6-b14e-0b75560e3207-kube-api-access-qrlrb\") pod \"dnsmasq-dns-b8fbc5445-bfwr6\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.219009 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.786817 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bfwr6"] Feb 18 06:05:00 crc kubenswrapper[4707]: W0218 06:05:00.807094 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd14db420_fcd2_4cc6_b14e_0b75560e3207.slice/crio-5a474bf4657e1233ac180301a059610b7989a1eefed7e11233c5d6db9f8b33e6 WatchSource:0}: Error finding container 5a474bf4657e1233ac180301a059610b7989a1eefed7e11233c5d6db9f8b33e6: Status 404 returned error can't find the container with id 5a474bf4657e1233ac180301a059610b7989a1eefed7e11233c5d6db9f8b33e6 Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.993529 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 18 06:05:00 crc kubenswrapper[4707]: I0218 06:05:00.999273 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.002698 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.002738 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-ztmr2" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.002738 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.002965 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.059591 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.168494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.169156 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5253fac6-1dd5-48c7-853a-f7cfa41840fa-cache\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.169185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5253fac6-1dd5-48c7-853a-f7cfa41840fa-lock\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: 
I0218 06:05:01.169209 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.169232 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5253fac6-1dd5-48c7-853a-f7cfa41840fa-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.169291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpv45\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-kube-api-access-fpv45\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.271727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5253fac6-1dd5-48c7-853a-f7cfa41840fa-cache\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.271886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5253fac6-1dd5-48c7-853a-f7cfa41840fa-lock\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.271935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.271980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5253fac6-1dd5-48c7-853a-f7cfa41840fa-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.272089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpv45\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-kube-api-access-fpv45\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: E0218 06:05:01.272158 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 06:05:01 crc kubenswrapper[4707]: E0218 06:05:01.272198 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 06:05:01 crc kubenswrapper[4707]: E0218 06:05:01.272262 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift podName:5253fac6-1dd5-48c7-853a-f7cfa41840fa nodeName:}" failed. No retries permitted until 2026-02-18 06:05:01.772241894 +0000 UTC m=+1038.420201018 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift") pod "swift-storage-0" (UID: "5253fac6-1dd5-48c7-853a-f7cfa41840fa") : configmap "swift-ring-files" not found Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.272176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.272498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5253fac6-1dd5-48c7-853a-f7cfa41840fa-cache\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.272521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5253fac6-1dd5-48c7-853a-f7cfa41840fa-lock\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.272892 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.279906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5253fac6-1dd5-48c7-853a-f7cfa41840fa-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" 
Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.291263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpv45\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-kube-api-access-fpv45\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.306890 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.461899 4707 generic.go:334] "Generic (PLEG): container finished" podID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerID="f37ce5589e05030213c59c072103c26e7e8b8c37db00688e5e4a03a714f05fe5" exitCode=0 Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.462002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" event={"ID":"d14db420-fcd2-4cc6-b14e-0b75560e3207","Type":"ContainerDied","Data":"f37ce5589e05030213c59c072103c26e7e8b8c37db00688e5e4a03a714f05fe5"} Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.462268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" event={"ID":"d14db420-fcd2-4cc6-b14e-0b75560e3207","Type":"ContainerStarted","Data":"5a474bf4657e1233ac180301a059610b7989a1eefed7e11233c5d6db9f8b33e6"} Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.663226 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rbttv"] Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.664295 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.669216 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.670039 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.670162 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.692106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rbttv"] Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.787084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-ring-data-devices\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.787149 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-combined-ca-bundle\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.787355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-swiftconf\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 
06:05:01.787443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdtlc\" (UniqueName: \"kubernetes.io/projected/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-kube-api-access-hdtlc\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.787496 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-etc-swift\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.787566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-dispersionconf\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.787662 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-scripts\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.787744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:01 crc kubenswrapper[4707]: E0218 06:05:01.787945 4707 projected.go:288] 
Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 06:05:01 crc kubenswrapper[4707]: E0218 06:05:01.787972 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 06:05:01 crc kubenswrapper[4707]: E0218 06:05:01.788033 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift podName:5253fac6-1dd5-48c7-853a-f7cfa41840fa nodeName:}" failed. No retries permitted until 2026-02-18 06:05:02.788009846 +0000 UTC m=+1039.435968990 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift") pod "swift-storage-0" (UID: "5253fac6-1dd5-48c7-853a-f7cfa41840fa") : configmap "swift-ring-files" not found Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.890002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-combined-ca-bundle\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.890600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-swiftconf\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.890649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdtlc\" (UniqueName: \"kubernetes.io/projected/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-kube-api-access-hdtlc\") pod \"swift-ring-rebalance-rbttv\" 
(UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.890699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-etc-swift\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.890763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-dispersionconf\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.890870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-scripts\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.891088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-ring-data-devices\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.891446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-etc-swift\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc 
kubenswrapper[4707]: I0218 06:05:01.892309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-scripts\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.892348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-ring-data-devices\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.894891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-swiftconf\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.894983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-combined-ca-bundle\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.895653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-dispersionconf\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.908103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hdtlc\" (UniqueName: \"kubernetes.io/projected/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-kube-api-access-hdtlc\") pod \"swift-ring-rebalance-rbttv\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:01 crc kubenswrapper[4707]: I0218 06:05:01.997087 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:02 crc kubenswrapper[4707]: I0218 06:05:02.471848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" event={"ID":"d14db420-fcd2-4cc6-b14e-0b75560e3207","Type":"ContainerStarted","Data":"80fce60381ac9e24e7f596b7f229e4e673bb85287f11c60e3520be80c063d434"} Feb 18 06:05:02 crc kubenswrapper[4707]: I0218 06:05:02.472098 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:02 crc kubenswrapper[4707]: I0218 06:05:02.498751 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" podStartSLOduration=3.498731228 podStartE2EDuration="3.498731228s" podCreationTimestamp="2026-02-18 06:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:02.488713539 +0000 UTC m=+1039.136672673" watchObservedRunningTime="2026-02-18 06:05:02.498731228 +0000 UTC m=+1039.146690372" Feb 18 06:05:02 crc kubenswrapper[4707]: I0218 06:05:02.506964 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rbttv"] Feb 18 06:05:02 crc kubenswrapper[4707]: I0218 06:05:02.808910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " 
pod="openstack/swift-storage-0" Feb 18 06:05:02 crc kubenswrapper[4707]: E0218 06:05:02.809141 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 06:05:02 crc kubenswrapper[4707]: E0218 06:05:02.809181 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 06:05:02 crc kubenswrapper[4707]: E0218 06:05:02.809249 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift podName:5253fac6-1dd5-48c7-853a-f7cfa41840fa nodeName:}" failed. No retries permitted until 2026-02-18 06:05:04.80922843 +0000 UTC m=+1041.457187564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift") pod "swift-storage-0" (UID: "5253fac6-1dd5-48c7-853a-f7cfa41840fa") : configmap "swift-ring-files" not found Feb 18 06:05:03 crc kubenswrapper[4707]: I0218 06:05:03.481313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rbttv" event={"ID":"4ff550e2-53ae-4f38-98d1-e95da8f7bde6","Type":"ContainerStarted","Data":"ee91f398a3b8c510e1391fecef3f73a1fef9280a1fe1834dce5994f734652eef"} Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.246699 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-28xrn"] Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.248029 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.251001 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.264242 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-28xrn"] Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.338362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee9488a-9a13-42d3-bedd-fa39647dd767-operator-scripts\") pod \"root-account-create-update-28xrn\" (UID: \"aee9488a-9a13-42d3-bedd-fa39647dd767\") " pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.338445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hg88\" (UniqueName: \"kubernetes.io/projected/aee9488a-9a13-42d3-bedd-fa39647dd767-kube-api-access-8hg88\") pod \"root-account-create-update-28xrn\" (UID: \"aee9488a-9a13-42d3-bedd-fa39647dd767\") " pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.440527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee9488a-9a13-42d3-bedd-fa39647dd767-operator-scripts\") pod \"root-account-create-update-28xrn\" (UID: \"aee9488a-9a13-42d3-bedd-fa39647dd767\") " pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.440577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hg88\" (UniqueName: \"kubernetes.io/projected/aee9488a-9a13-42d3-bedd-fa39647dd767-kube-api-access-8hg88\") pod \"root-account-create-update-28xrn\" (UID: 
\"aee9488a-9a13-42d3-bedd-fa39647dd767\") " pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.441967 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee9488a-9a13-42d3-bedd-fa39647dd767-operator-scripts\") pod \"root-account-create-update-28xrn\" (UID: \"aee9488a-9a13-42d3-bedd-fa39647dd767\") " pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.461535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hg88\" (UniqueName: \"kubernetes.io/projected/aee9488a-9a13-42d3-bedd-fa39647dd767-kube-api-access-8hg88\") pod \"root-account-create-update-28xrn\" (UID: \"aee9488a-9a13-42d3-bedd-fa39647dd767\") " pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.571961 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:04 crc kubenswrapper[4707]: I0218 06:05:04.852741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:04 crc kubenswrapper[4707]: E0218 06:05:04.853030 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 06:05:04 crc kubenswrapper[4707]: E0218 06:05:04.853451 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 06:05:04 crc kubenswrapper[4707]: E0218 06:05:04.853517 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift podName:5253fac6-1dd5-48c7-853a-f7cfa41840fa nodeName:}" failed. No retries permitted until 2026-02-18 06:05:08.85349494 +0000 UTC m=+1045.501454064 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift") pod "swift-storage-0" (UID: "5253fac6-1dd5-48c7-853a-f7cfa41840fa") : configmap "swift-ring-files" not found Feb 18 06:05:05 crc kubenswrapper[4707]: I0218 06:05:05.075627 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-28xrn"] Feb 18 06:05:05 crc kubenswrapper[4707]: W0218 06:05:05.082215 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee9488a_9a13_42d3_bedd_fa39647dd767.slice/crio-8c5c761dc1cf3797763d345ed3be939aab1171d0836316b5f6a398b619669e85 WatchSource:0}: Error finding container 8c5c761dc1cf3797763d345ed3be939aab1171d0836316b5f6a398b619669e85: Status 404 returned error can't find the container with id 8c5c761dc1cf3797763d345ed3be939aab1171d0836316b5f6a398b619669e85 Feb 18 06:05:05 crc kubenswrapper[4707]: I0218 06:05:05.503245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-28xrn" event={"ID":"aee9488a-9a13-42d3-bedd-fa39647dd767","Type":"ContainerStarted","Data":"2d923c60f672c03514692cd1e760948e35d5cd6c66436215cf7fd41b05ae775e"} Feb 18 06:05:05 crc kubenswrapper[4707]: I0218 06:05:05.503317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-28xrn" event={"ID":"aee9488a-9a13-42d3-bedd-fa39647dd767","Type":"ContainerStarted","Data":"8c5c761dc1cf3797763d345ed3be939aab1171d0836316b5f6a398b619669e85"} Feb 18 06:05:05 crc kubenswrapper[4707]: I0218 06:05:05.526540 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-28xrn" podStartSLOduration=1.52649221 podStartE2EDuration="1.52649221s" podCreationTimestamp="2026-02-18 06:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-18 06:05:05.522183045 +0000 UTC m=+1042.170142199" watchObservedRunningTime="2026-02-18 06:05:05.52649221 +0000 UTC m=+1042.174451354" Feb 18 06:05:06 crc kubenswrapper[4707]: I0218 06:05:06.191839 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 18 06:05:06 crc kubenswrapper[4707]: I0218 06:05:06.514121 4707 generic.go:334] "Generic (PLEG): container finished" podID="aee9488a-9a13-42d3-bedd-fa39647dd767" containerID="2d923c60f672c03514692cd1e760948e35d5cd6c66436215cf7fd41b05ae775e" exitCode=0 Feb 18 06:05:06 crc kubenswrapper[4707]: I0218 06:05:06.514166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-28xrn" event={"ID":"aee9488a-9a13-42d3-bedd-fa39647dd767","Type":"ContainerDied","Data":"2d923c60f672c03514692cd1e760948e35d5cd6c66436215cf7fd41b05ae775e"} Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.400827 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-wf8p8"] Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.402450 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.427541 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wf8p8"] Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.510955 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ffc590-9556-4e8b-9faf-ed5df3a747a8-operator-scripts\") pod \"glance-db-create-wf8p8\" (UID: \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\") " pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.511024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwh7l\" (UniqueName: \"kubernetes.io/projected/80ffc590-9556-4e8b-9faf-ed5df3a747a8-kube-api-access-xwh7l\") pod \"glance-db-create-wf8p8\" (UID: \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\") " pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.536934 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8c09-account-create-update-f5p6n"] Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.538036 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.544869 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.551021 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8c09-account-create-update-f5p6n"] Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.615948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwh7l\" (UniqueName: \"kubernetes.io/projected/80ffc590-9556-4e8b-9faf-ed5df3a747a8-kube-api-access-xwh7l\") pod \"glance-db-create-wf8p8\" (UID: \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\") " pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.616443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ffc590-9556-4e8b-9faf-ed5df3a747a8-operator-scripts\") pod \"glance-db-create-wf8p8\" (UID: \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\") " pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.617862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ffc590-9556-4e8b-9faf-ed5df3a747a8-operator-scripts\") pod \"glance-db-create-wf8p8\" (UID: \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\") " pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.639773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwh7l\" (UniqueName: \"kubernetes.io/projected/80ffc590-9556-4e8b-9faf-ed5df3a747a8-kube-api-access-xwh7l\") pod \"glance-db-create-wf8p8\" (UID: \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\") " pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 
06:05:07.726935 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.728561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8496e6be-0819-4878-a823-31f90c5fd272-operator-scripts\") pod \"glance-8c09-account-create-update-f5p6n\" (UID: \"8496e6be-0819-4878-a823-31f90c5fd272\") " pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.728716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xxld\" (UniqueName: \"kubernetes.io/projected/8496e6be-0819-4878-a823-31f90c5fd272-kube-api-access-9xxld\") pod \"glance-8c09-account-create-update-f5p6n\" (UID: \"8496e6be-0819-4878-a823-31f90c5fd272\") " pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.831006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8496e6be-0819-4878-a823-31f90c5fd272-operator-scripts\") pod \"glance-8c09-account-create-update-f5p6n\" (UID: \"8496e6be-0819-4878-a823-31f90c5fd272\") " pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.831440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xxld\" (UniqueName: \"kubernetes.io/projected/8496e6be-0819-4878-a823-31f90c5fd272-kube-api-access-9xxld\") pod \"glance-8c09-account-create-update-f5p6n\" (UID: \"8496e6be-0819-4878-a823-31f90c5fd272\") " pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.833302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8496e6be-0819-4878-a823-31f90c5fd272-operator-scripts\") pod \"glance-8c09-account-create-update-f5p6n\" (UID: \"8496e6be-0819-4878-a823-31f90c5fd272\") " pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.848539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xxld\" (UniqueName: \"kubernetes.io/projected/8496e6be-0819-4878-a823-31f90c5fd272-kube-api-access-9xxld\") pod \"glance-8c09-account-create-update-f5p6n\" (UID: \"8496e6be-0819-4878-a823-31f90c5fd272\") " pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:07 crc kubenswrapper[4707]: I0218 06:05:07.859097 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.180699 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-898jt"] Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.183490 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-898jt" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.192202 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-898jt"] Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.279847 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6da2-account-create-update-h65t8"] Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.281336 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.284417 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.287732 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6da2-account-create-update-h65t8"] Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.342449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1439301a-c008-4af2-bb69-6857397051f3-operator-scripts\") pod \"keystone-db-create-898jt\" (UID: \"1439301a-c008-4af2-bb69-6857397051f3\") " pod="openstack/keystone-db-create-898jt" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.342561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmbrc\" (UniqueName: \"kubernetes.io/projected/1439301a-c008-4af2-bb69-6857397051f3-kube-api-access-wmbrc\") pod \"keystone-db-create-898jt\" (UID: \"1439301a-c008-4af2-bb69-6857397051f3\") " pod="openstack/keystone-db-create-898jt" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.421086 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-w556k"] Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.422223 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-w556k" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.427130 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w556k"] Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.447598 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b042550-a34c-44f7-9a49-29bb4f865dbd-operator-scripts\") pod \"keystone-6da2-account-create-update-h65t8\" (UID: \"0b042550-a34c-44f7-9a49-29bb4f865dbd\") " pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.448050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmbrc\" (UniqueName: \"kubernetes.io/projected/1439301a-c008-4af2-bb69-6857397051f3-kube-api-access-wmbrc\") pod \"keystone-db-create-898jt\" (UID: \"1439301a-c008-4af2-bb69-6857397051f3\") " pod="openstack/keystone-db-create-898jt" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.448219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4skck\" (UniqueName: \"kubernetes.io/projected/0b042550-a34c-44f7-9a49-29bb4f865dbd-kube-api-access-4skck\") pod \"keystone-6da2-account-create-update-h65t8\" (UID: \"0b042550-a34c-44f7-9a49-29bb4f865dbd\") " pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.448600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1439301a-c008-4af2-bb69-6857397051f3-operator-scripts\") pod \"keystone-db-create-898jt\" (UID: \"1439301a-c008-4af2-bb69-6857397051f3\") " pod="openstack/keystone-db-create-898jt" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.450256 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1439301a-c008-4af2-bb69-6857397051f3-operator-scripts\") pod \"keystone-db-create-898jt\" (UID: \"1439301a-c008-4af2-bb69-6857397051f3\") " pod="openstack/keystone-db-create-898jt" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.468139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmbrc\" (UniqueName: \"kubernetes.io/projected/1439301a-c008-4af2-bb69-6857397051f3-kube-api-access-wmbrc\") pod \"keystone-db-create-898jt\" (UID: \"1439301a-c008-4af2-bb69-6857397051f3\") " pod="openstack/keystone-db-create-898jt" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.510491 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-898jt" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.543153 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9bf7-account-create-update-trgcj"] Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.544594 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.547844 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.549685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4skck\" (UniqueName: \"kubernetes.io/projected/0b042550-a34c-44f7-9a49-29bb4f865dbd-kube-api-access-4skck\") pod \"keystone-6da2-account-create-update-h65t8\" (UID: \"0b042550-a34c-44f7-9a49-29bb4f865dbd\") " pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.549841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39f42416-5e4a-475b-862e-71cb10661178-operator-scripts\") pod \"placement-db-create-w556k\" (UID: \"39f42416-5e4a-475b-862e-71cb10661178\") " pod="openstack/placement-db-create-w556k" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.549899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b042550-a34c-44f7-9a49-29bb4f865dbd-operator-scripts\") pod \"keystone-6da2-account-create-update-h65t8\" (UID: \"0b042550-a34c-44f7-9a49-29bb4f865dbd\") " pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.549942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfktn\" (UniqueName: \"kubernetes.io/projected/39f42416-5e4a-475b-862e-71cb10661178-kube-api-access-bfktn\") pod \"placement-db-create-w556k\" (UID: \"39f42416-5e4a-475b-862e-71cb10661178\") " pod="openstack/placement-db-create-w556k" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.551831 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b042550-a34c-44f7-9a49-29bb4f865dbd-operator-scripts\") pod \"keystone-6da2-account-create-update-h65t8\" (UID: \"0b042550-a34c-44f7-9a49-29bb4f865dbd\") " pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.558700 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9bf7-account-create-update-trgcj"] Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.579192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4skck\" (UniqueName: \"kubernetes.io/projected/0b042550-a34c-44f7-9a49-29bb4f865dbd-kube-api-access-4skck\") pod \"keystone-6da2-account-create-update-h65t8\" (UID: \"0b042550-a34c-44f7-9a49-29bb4f865dbd\") " pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.601381 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.642731 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.653098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39f42416-5e4a-475b-862e-71cb10661178-operator-scripts\") pod \"placement-db-create-w556k\" (UID: \"39f42416-5e4a-475b-862e-71cb10661178\") " pod="openstack/placement-db-create-w556k" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.653192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfktn\" (UniqueName: \"kubernetes.io/projected/39f42416-5e4a-475b-862e-71cb10661178-kube-api-access-bfktn\") pod \"placement-db-create-w556k\" (UID: \"39f42416-5e4a-475b-862e-71cb10661178\") " pod="openstack/placement-db-create-w556k" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.653228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2b8k\" (UniqueName: \"kubernetes.io/projected/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-kube-api-access-p2b8k\") pod \"placement-9bf7-account-create-update-trgcj\" (UID: \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\") " pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.653290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-operator-scripts\") pod \"placement-9bf7-account-create-update-trgcj\" (UID: \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\") " pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.653939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/39f42416-5e4a-475b-862e-71cb10661178-operator-scripts\") pod \"placement-db-create-w556k\" (UID: \"39f42416-5e4a-475b-862e-71cb10661178\") " pod="openstack/placement-db-create-w556k" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.676331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfktn\" (UniqueName: \"kubernetes.io/projected/39f42416-5e4a-475b-862e-71cb10661178-kube-api-access-bfktn\") pod \"placement-db-create-w556k\" (UID: \"39f42416-5e4a-475b-862e-71cb10661178\") " pod="openstack/placement-db-create-w556k" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.754670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hg88\" (UniqueName: \"kubernetes.io/projected/aee9488a-9a13-42d3-bedd-fa39647dd767-kube-api-access-8hg88\") pod \"aee9488a-9a13-42d3-bedd-fa39647dd767\" (UID: \"aee9488a-9a13-42d3-bedd-fa39647dd767\") " Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.755079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee9488a-9a13-42d3-bedd-fa39647dd767-operator-scripts\") pod \"aee9488a-9a13-42d3-bedd-fa39647dd767\" (UID: \"aee9488a-9a13-42d3-bedd-fa39647dd767\") " Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.755360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2b8k\" (UniqueName: \"kubernetes.io/projected/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-kube-api-access-p2b8k\") pod \"placement-9bf7-account-create-update-trgcj\" (UID: \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\") " pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.755405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-operator-scripts\") pod \"placement-9bf7-account-create-update-trgcj\" (UID: \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\") " pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.756162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee9488a-9a13-42d3-bedd-fa39647dd767-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aee9488a-9a13-42d3-bedd-fa39647dd767" (UID: "aee9488a-9a13-42d3-bedd-fa39647dd767"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.756805 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-operator-scripts\") pod \"placement-9bf7-account-create-update-trgcj\" (UID: \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\") " pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.760827 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee9488a-9a13-42d3-bedd-fa39647dd767-kube-api-access-8hg88" (OuterVolumeSpecName: "kube-api-access-8hg88") pod "aee9488a-9a13-42d3-bedd-fa39647dd767" (UID: "aee9488a-9a13-42d3-bedd-fa39647dd767"). InnerVolumeSpecName "kube-api-access-8hg88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.766091 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-w556k" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.775310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2b8k\" (UniqueName: \"kubernetes.io/projected/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-kube-api-access-p2b8k\") pod \"placement-9bf7-account-create-update-trgcj\" (UID: \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\") " pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.857820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.857933 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee9488a-9a13-42d3-bedd-fa39647dd767-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.857949 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hg88\" (UniqueName: \"kubernetes.io/projected/aee9488a-9a13-42d3-bedd-fa39647dd767-kube-api-access-8hg88\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:08 crc kubenswrapper[4707]: E0218 06:05:08.858064 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 06:05:08 crc kubenswrapper[4707]: E0218 06:05:08.858085 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 06:05:08 crc kubenswrapper[4707]: E0218 06:05:08.858159 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift 
podName:5253fac6-1dd5-48c7-853a-f7cfa41840fa nodeName:}" failed. No retries permitted until 2026-02-18 06:05:16.858141489 +0000 UTC m=+1053.506100613 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift") pod "swift-storage-0" (UID: "5253fac6-1dd5-48c7-853a-f7cfa41840fa") : configmap "swift-ring-files" not found Feb 18 06:05:08 crc kubenswrapper[4707]: I0218 06:05:08.867771 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.145651 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-898jt"] Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.243105 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8c09-account-create-update-f5p6n"] Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.248378 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wf8p8"] Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.348488 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6da2-account-create-update-h65t8"] Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.354501 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w556k"] Feb 18 06:05:09 crc kubenswrapper[4707]: W0218 06:05:09.360430 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b042550_a34c_44f7_9a49_29bb4f865dbd.slice/crio-55bc9ca675fb22e20c0a1fef5605a1b9f561d1e393a919099726b98774530074 WatchSource:0}: Error finding container 55bc9ca675fb22e20c0a1fef5605a1b9f561d1e393a919099726b98774530074: Status 404 returned error can't find the container with id 55bc9ca675fb22e20c0a1fef5605a1b9f561d1e393a919099726b98774530074 
Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.475426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9bf7-account-create-update-trgcj"] Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.559746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c09-account-create-update-f5p6n" event={"ID":"8496e6be-0819-4878-a823-31f90c5fd272","Type":"ContainerStarted","Data":"d5e9efaaa534d435389bcc215327bbea476ca3c9b3fa37945bf2de3248f0a6f8"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.560218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c09-account-create-update-f5p6n" event={"ID":"8496e6be-0819-4878-a823-31f90c5fd272","Type":"ContainerStarted","Data":"6975e10dc8323ccabeaa2274c4561d67a168dbb2f105027341828843f60774b0"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.561461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9bf7-account-create-update-trgcj" event={"ID":"d0fe62a2-3b67-40cf-8d6c-fd68f9667276","Type":"ContainerStarted","Data":"46c2da10b43d5db45bb14dbd5155556cd348053c923af10deb68cf50350d12bb"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.563901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rbttv" event={"ID":"4ff550e2-53ae-4f38-98d1-e95da8f7bde6","Type":"ContainerStarted","Data":"dc4464d0b69c7de0fdb818e8441d8b46e124b5ac6a0d625c606fe481cbd6701c"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.567973 4707 generic.go:334] "Generic (PLEG): container finished" podID="1439301a-c008-4af2-bb69-6857397051f3" containerID="4cda4ee41b31596d77a33c53f558aab4c37e2a475cfb4eb9cf6fceace1caa048" exitCode=0 Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.568082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-898jt" 
event={"ID":"1439301a-c008-4af2-bb69-6857397051f3","Type":"ContainerDied","Data":"4cda4ee41b31596d77a33c53f558aab4c37e2a475cfb4eb9cf6fceace1caa048"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.568102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-898jt" event={"ID":"1439301a-c008-4af2-bb69-6857397051f3","Type":"ContainerStarted","Data":"d44fe968238a7d2e4f6de6adfb38373c1c63a4670f3c293e4938a09a8722427b"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.570323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w556k" event={"ID":"39f42416-5e4a-475b-862e-71cb10661178","Type":"ContainerStarted","Data":"7e661fde4df97830aca9914df1d357493866ccfdb1951939d2fd404fe4456c62"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.570386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w556k" event={"ID":"39f42416-5e4a-475b-862e-71cb10661178","Type":"ContainerStarted","Data":"86666ec0321c8446e7695b0e9edb392e305dbcb7d603703a179d726d0f2ae2c4"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.594041 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-28xrn" Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.595056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-28xrn" event={"ID":"aee9488a-9a13-42d3-bedd-fa39647dd767","Type":"ContainerDied","Data":"8c5c761dc1cf3797763d345ed3be939aab1171d0836316b5f6a398b619669e85"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.596292 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c5c761dc1cf3797763d345ed3be939aab1171d0836316b5f6a398b619669e85" Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.600237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wf8p8" event={"ID":"80ffc590-9556-4e8b-9faf-ed5df3a747a8","Type":"ContainerStarted","Data":"3becbb6e025eeb676be7a213ab27f3ae08405832a9869a73829c02bd1034c8d6"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.600387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wf8p8" event={"ID":"80ffc590-9556-4e8b-9faf-ed5df3a747a8","Type":"ContainerStarted","Data":"0b5f008d92c4f35f3ce400ea3549c83b0a6c2ce92e1aab907ccd083a429c7998"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.602293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6da2-account-create-update-h65t8" event={"ID":"0b042550-a34c-44f7-9a49-29bb4f865dbd","Type":"ContainerStarted","Data":"1426c314ff7a3e186e8d92c82cc7e770a994448844a585fd6b5cec870c3bafcd"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.602327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6da2-account-create-update-h65t8" event={"ID":"0b042550-a34c-44f7-9a49-29bb4f865dbd","Type":"ContainerStarted","Data":"55bc9ca675fb22e20c0a1fef5605a1b9f561d1e393a919099726b98774530074"} Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.606099 4707 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/glance-8c09-account-create-update-f5p6n" podStartSLOduration=2.60608219 podStartE2EDuration="2.60608219s" podCreationTimestamp="2026-02-18 06:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:09.574058545 +0000 UTC m=+1046.222017679" watchObservedRunningTime="2026-02-18 06:05:09.60608219 +0000 UTC m=+1046.254041324" Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.634197 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-w556k" podStartSLOduration=1.63417546 podStartE2EDuration="1.63417546s" podCreationTimestamp="2026-02-18 06:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:09.611120095 +0000 UTC m=+1046.259079229" watchObservedRunningTime="2026-02-18 06:05:09.63417546 +0000 UTC m=+1046.282134594" Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.649297 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-rbttv" podStartSLOduration=2.472928671 podStartE2EDuration="8.649275993s" podCreationTimestamp="2026-02-18 06:05:01 +0000 UTC" firstStartedPulling="2026-02-18 06:05:02.507894034 +0000 UTC m=+1039.155853168" lastFinishedPulling="2026-02-18 06:05:08.684241356 +0000 UTC m=+1045.332200490" observedRunningTime="2026-02-18 06:05:09.633096172 +0000 UTC m=+1046.281055326" watchObservedRunningTime="2026-02-18 06:05:09.649275993 +0000 UTC m=+1046.297235127" Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.658004 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6da2-account-create-update-h65t8" podStartSLOduration=1.657978916 podStartE2EDuration="1.657978916s" podCreationTimestamp="2026-02-18 06:05:08 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:09.647130687 +0000 UTC m=+1046.295089821" watchObservedRunningTime="2026-02-18 06:05:09.657978916 +0000 UTC m=+1046.305938060" Feb 18 06:05:09 crc kubenswrapper[4707]: I0218 06:05:09.679377 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-wf8p8" podStartSLOduration=2.679357117 podStartE2EDuration="2.679357117s" podCreationTimestamp="2026-02-18 06:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:09.669538695 +0000 UTC m=+1046.317497829" watchObservedRunningTime="2026-02-18 06:05:09.679357117 +0000 UTC m=+1046.327316241" Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.222258 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.290492 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-m9gbr"] Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.290763 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-m9gbr" podUID="456e788f-7f0e-429e-8049-e023596ef19b" containerName="dnsmasq-dns" containerID="cri-o://e7b0396c9755caeb1b5fb02d292d016e76b0c80b2e0a7393ab1998d534ab833f" gracePeriod=10 Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.454957 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-28xrn"] Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.461997 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-28xrn"] Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.611564 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="d0fe62a2-3b67-40cf-8d6c-fd68f9667276" containerID="e6f3506bdc428e67019473cffb371b7e71fd6387146ddbd9cffc02a585aee599" exitCode=0 Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.611645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9bf7-account-create-update-trgcj" event={"ID":"d0fe62a2-3b67-40cf-8d6c-fd68f9667276","Type":"ContainerDied","Data":"e6f3506bdc428e67019473cffb371b7e71fd6387146ddbd9cffc02a585aee599"} Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.613117 4707 generic.go:334] "Generic (PLEG): container finished" podID="39f42416-5e4a-475b-862e-71cb10661178" containerID="7e661fde4df97830aca9914df1d357493866ccfdb1951939d2fd404fe4456c62" exitCode=0 Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.613161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w556k" event={"ID":"39f42416-5e4a-475b-862e-71cb10661178","Type":"ContainerDied","Data":"7e661fde4df97830aca9914df1d357493866ccfdb1951939d2fd404fe4456c62"} Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.614178 4707 generic.go:334] "Generic (PLEG): container finished" podID="80ffc590-9556-4e8b-9faf-ed5df3a747a8" containerID="3becbb6e025eeb676be7a213ab27f3ae08405832a9869a73829c02bd1034c8d6" exitCode=0 Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.614209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wf8p8" event={"ID":"80ffc590-9556-4e8b-9faf-ed5df3a747a8","Type":"ContainerDied","Data":"3becbb6e025eeb676be7a213ab27f3ae08405832a9869a73829c02bd1034c8d6"} Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.615346 4707 generic.go:334] "Generic (PLEG): container finished" podID="456e788f-7f0e-429e-8049-e023596ef19b" containerID="e7b0396c9755caeb1b5fb02d292d016e76b0c80b2e0a7393ab1998d534ab833f" exitCode=0 Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.615376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8554648995-m9gbr" event={"ID":"456e788f-7f0e-429e-8049-e023596ef19b","Type":"ContainerDied","Data":"e7b0396c9755caeb1b5fb02d292d016e76b0c80b2e0a7393ab1998d534ab833f"} Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.616224 4707 generic.go:334] "Generic (PLEG): container finished" podID="0b042550-a34c-44f7-9a49-29bb4f865dbd" containerID="1426c314ff7a3e186e8d92c82cc7e770a994448844a585fd6b5cec870c3bafcd" exitCode=0 Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.616254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6da2-account-create-update-h65t8" event={"ID":"0b042550-a34c-44f7-9a49-29bb4f865dbd","Type":"ContainerDied","Data":"1426c314ff7a3e186e8d92c82cc7e770a994448844a585fd6b5cec870c3bafcd"} Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.617099 4707 generic.go:334] "Generic (PLEG): container finished" podID="8496e6be-0819-4878-a823-31f90c5fd272" containerID="d5e9efaaa534d435389bcc215327bbea476ca3c9b3fa37945bf2de3248f0a6f8" exitCode=0 Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.617827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c09-account-create-update-f5p6n" event={"ID":"8496e6be-0819-4878-a823-31f90c5fd272","Type":"ContainerDied","Data":"d5e9efaaa534d435389bcc215327bbea476ca3c9b3fa37945bf2de3248f0a6f8"} Feb 18 06:05:10 crc kubenswrapper[4707]: I0218 06:05:10.879955 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.008532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt4hx\" (UniqueName: \"kubernetes.io/projected/456e788f-7f0e-429e-8049-e023596ef19b-kube-api-access-gt4hx\") pod \"456e788f-7f0e-429e-8049-e023596ef19b\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.009808 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-sb\") pod \"456e788f-7f0e-429e-8049-e023596ef19b\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.010412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-config\") pod \"456e788f-7f0e-429e-8049-e023596ef19b\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.010531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-nb\") pod \"456e788f-7f0e-429e-8049-e023596ef19b\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.010621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-dns-svc\") pod \"456e788f-7f0e-429e-8049-e023596ef19b\" (UID: \"456e788f-7f0e-429e-8049-e023596ef19b\") " Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.016292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/456e788f-7f0e-429e-8049-e023596ef19b-kube-api-access-gt4hx" (OuterVolumeSpecName: "kube-api-access-gt4hx") pod "456e788f-7f0e-429e-8049-e023596ef19b" (UID: "456e788f-7f0e-429e-8049-e023596ef19b"). InnerVolumeSpecName "kube-api-access-gt4hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.050129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "456e788f-7f0e-429e-8049-e023596ef19b" (UID: "456e788f-7f0e-429e-8049-e023596ef19b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.055911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-config" (OuterVolumeSpecName: "config") pod "456e788f-7f0e-429e-8049-e023596ef19b" (UID: "456e788f-7f0e-429e-8049-e023596ef19b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.056862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "456e788f-7f0e-429e-8049-e023596ef19b" (UID: "456e788f-7f0e-429e-8049-e023596ef19b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.057596 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "456e788f-7f0e-429e-8049-e023596ef19b" (UID: "456e788f-7f0e-429e-8049-e023596ef19b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.073449 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-898jt" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.112643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1439301a-c008-4af2-bb69-6857397051f3-operator-scripts\") pod \"1439301a-c008-4af2-bb69-6857397051f3\" (UID: \"1439301a-c008-4af2-bb69-6857397051f3\") " Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.113114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmbrc\" (UniqueName: \"kubernetes.io/projected/1439301a-c008-4af2-bb69-6857397051f3-kube-api-access-wmbrc\") pod \"1439301a-c008-4af2-bb69-6857397051f3\" (UID: \"1439301a-c008-4af2-bb69-6857397051f3\") " Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.113250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1439301a-c008-4af2-bb69-6857397051f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1439301a-c008-4af2-bb69-6857397051f3" (UID: "1439301a-c008-4af2-bb69-6857397051f3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.113842 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1439301a-c008-4af2-bb69-6857397051f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.113868 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt4hx\" (UniqueName: \"kubernetes.io/projected/456e788f-7f0e-429e-8049-e023596ef19b-kube-api-access-gt4hx\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.113881 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.113899 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.113909 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.113918 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456e788f-7f0e-429e-8049-e023596ef19b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.122307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1439301a-c008-4af2-bb69-6857397051f3-kube-api-access-wmbrc" (OuterVolumeSpecName: "kube-api-access-wmbrc") pod "1439301a-c008-4af2-bb69-6857397051f3" (UID: 
"1439301a-c008-4af2-bb69-6857397051f3"). InnerVolumeSpecName "kube-api-access-wmbrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.215359 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmbrc\" (UniqueName: \"kubernetes.io/projected/1439301a-c008-4af2-bb69-6857397051f3-kube-api-access-wmbrc\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.626564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-m9gbr" event={"ID":"456e788f-7f0e-429e-8049-e023596ef19b","Type":"ContainerDied","Data":"49d4b7cb85615b630b898934d1f253a6ca59d257f84f3139aca7655fb86de4c0"} Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.627191 4707 scope.go:117] "RemoveContainer" containerID="e7b0396c9755caeb1b5fb02d292d016e76b0c80b2e0a7393ab1998d534ab833f" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.626578 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-m9gbr" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.629014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-898jt" event={"ID":"1439301a-c008-4af2-bb69-6857397051f3","Type":"ContainerDied","Data":"d44fe968238a7d2e4f6de6adfb38373c1c63a4670f3c293e4938a09a8722427b"} Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.629042 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d44fe968238a7d2e4f6de6adfb38373c1c63a4670f3c293e4938a09a8722427b" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.629204 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-898jt" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.662449 4707 scope.go:117] "RemoveContainer" containerID="c1eb60ee387c8782ff8ef08dc4c1c48d2927007dd0875469200661e9ba754adb" Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.673775 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-m9gbr"] Feb 18 06:05:11 crc kubenswrapper[4707]: I0218 06:05:11.681855 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-m9gbr"] Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.008175 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.031603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwh7l\" (UniqueName: \"kubernetes.io/projected/80ffc590-9556-4e8b-9faf-ed5df3a747a8-kube-api-access-xwh7l\") pod \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\" (UID: \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.034468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ffc590-9556-4e8b-9faf-ed5df3a747a8-operator-scripts\") pod \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\" (UID: \"80ffc590-9556-4e8b-9faf-ed5df3a747a8\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.038608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80ffc590-9556-4e8b-9faf-ed5df3a747a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80ffc590-9556-4e8b-9faf-ed5df3a747a8" (UID: "80ffc590-9556-4e8b-9faf-ed5df3a747a8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.053962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ffc590-9556-4e8b-9faf-ed5df3a747a8-kube-api-access-xwh7l" (OuterVolumeSpecName: "kube-api-access-xwh7l") pod "80ffc590-9556-4e8b-9faf-ed5df3a747a8" (UID: "80ffc590-9556-4e8b-9faf-ed5df3a747a8"). InnerVolumeSpecName "kube-api-access-xwh7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.066269 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="456e788f-7f0e-429e-8049-e023596ef19b" path="/var/lib/kubelet/pods/456e788f-7f0e-429e-8049-e023596ef19b/volumes" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.067082 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee9488a-9a13-42d3-bedd-fa39647dd767" path="/var/lib/kubelet/pods/aee9488a-9a13-42d3-bedd-fa39647dd767/volumes" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.140739 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ffc590-9556-4e8b-9faf-ed5df3a747a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.141205 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwh7l\" (UniqueName: \"kubernetes.io/projected/80ffc590-9556-4e8b-9faf-ed5df3a747a8-kube-api-access-xwh7l\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.286334 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w556k" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.296052 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.304182 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.315849 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.345351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8496e6be-0819-4878-a823-31f90c5fd272-operator-scripts\") pod \"8496e6be-0819-4878-a823-31f90c5fd272\" (UID: \"8496e6be-0819-4878-a823-31f90c5fd272\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.345740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b042550-a34c-44f7-9a49-29bb4f865dbd-operator-scripts\") pod \"0b042550-a34c-44f7-9a49-29bb4f865dbd\" (UID: \"0b042550-a34c-44f7-9a49-29bb4f865dbd\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.345891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4skck\" (UniqueName: \"kubernetes.io/projected/0b042550-a34c-44f7-9a49-29bb4f865dbd-kube-api-access-4skck\") pod \"0b042550-a34c-44f7-9a49-29bb4f865dbd\" (UID: \"0b042550-a34c-44f7-9a49-29bb4f865dbd\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.345964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xxld\" (UniqueName: \"kubernetes.io/projected/8496e6be-0819-4878-a823-31f90c5fd272-kube-api-access-9xxld\") pod \"8496e6be-0819-4878-a823-31f90c5fd272\" (UID: \"8496e6be-0819-4878-a823-31f90c5fd272\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 
06:05:12.346053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-operator-scripts\") pod \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\" (UID: \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.346132 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8496e6be-0819-4878-a823-31f90c5fd272-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8496e6be-0819-4878-a823-31f90c5fd272" (UID: "8496e6be-0819-4878-a823-31f90c5fd272"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.346145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfktn\" (UniqueName: \"kubernetes.io/projected/39f42416-5e4a-475b-862e-71cb10661178-kube-api-access-bfktn\") pod \"39f42416-5e4a-475b-862e-71cb10661178\" (UID: \"39f42416-5e4a-475b-862e-71cb10661178\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.346195 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2b8k\" (UniqueName: \"kubernetes.io/projected/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-kube-api-access-p2b8k\") pod \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\" (UID: \"d0fe62a2-3b67-40cf-8d6c-fd68f9667276\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.346266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39f42416-5e4a-475b-862e-71cb10661178-operator-scripts\") pod \"39f42416-5e4a-475b-862e-71cb10661178\" (UID: \"39f42416-5e4a-475b-862e-71cb10661178\") " Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.346957 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/8496e6be-0819-4878-a823-31f90c5fd272-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.347344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f42416-5e4a-475b-862e-71cb10661178-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39f42416-5e4a-475b-862e-71cb10661178" (UID: "39f42416-5e4a-475b-862e-71cb10661178"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.350559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0fe62a2-3b67-40cf-8d6c-fd68f9667276" (UID: "d0fe62a2-3b67-40cf-8d6c-fd68f9667276"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.351202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b042550-a34c-44f7-9a49-29bb4f865dbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b042550-a34c-44f7-9a49-29bb4f865dbd" (UID: "0b042550-a34c-44f7-9a49-29bb4f865dbd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.351282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b042550-a34c-44f7-9a49-29bb4f865dbd-kube-api-access-4skck" (OuterVolumeSpecName: "kube-api-access-4skck") pod "0b042550-a34c-44f7-9a49-29bb4f865dbd" (UID: "0b042550-a34c-44f7-9a49-29bb4f865dbd"). InnerVolumeSpecName "kube-api-access-4skck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.352264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-kube-api-access-p2b8k" (OuterVolumeSpecName: "kube-api-access-p2b8k") pod "d0fe62a2-3b67-40cf-8d6c-fd68f9667276" (UID: "d0fe62a2-3b67-40cf-8d6c-fd68f9667276"). InnerVolumeSpecName "kube-api-access-p2b8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.352548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f42416-5e4a-475b-862e-71cb10661178-kube-api-access-bfktn" (OuterVolumeSpecName: "kube-api-access-bfktn") pod "39f42416-5e4a-475b-862e-71cb10661178" (UID: "39f42416-5e4a-475b-862e-71cb10661178"). InnerVolumeSpecName "kube-api-access-bfktn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.354374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8496e6be-0819-4878-a823-31f90c5fd272-kube-api-access-9xxld" (OuterVolumeSpecName: "kube-api-access-9xxld") pod "8496e6be-0819-4878-a823-31f90c5fd272" (UID: "8496e6be-0819-4878-a823-31f90c5fd272"). InnerVolumeSpecName "kube-api-access-9xxld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.368407 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-297xq" podUID="ea5baf83-32e6-41ec-b14a-d32b3f848be6" containerName="ovn-controller" probeResult="failure" output=< Feb 18 06:05:12 crc kubenswrapper[4707]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 18 06:05:12 crc kubenswrapper[4707]: > Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.399535 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-f95ql" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.447687 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b042550-a34c-44f7-9a49-29bb4f865dbd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.447718 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4skck\" (UniqueName: \"kubernetes.io/projected/0b042550-a34c-44f7-9a49-29bb4f865dbd-kube-api-access-4skck\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.447727 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xxld\" (UniqueName: \"kubernetes.io/projected/8496e6be-0819-4878-a823-31f90c5fd272-kube-api-access-9xxld\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.447737 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.447748 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfktn\" (UniqueName: 
\"kubernetes.io/projected/39f42416-5e4a-475b-862e-71cb10661178-kube-api-access-bfktn\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.447756 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2b8k\" (UniqueName: \"kubernetes.io/projected/d0fe62a2-3b67-40cf-8d6c-fd68f9667276-kube-api-access-p2b8k\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.447765 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39f42416-5e4a-475b-862e-71cb10661178-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.455835 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-f95ql" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.638184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6da2-account-create-update-h65t8" event={"ID":"0b042550-a34c-44f7-9a49-29bb4f865dbd","Type":"ContainerDied","Data":"55bc9ca675fb22e20c0a1fef5605a1b9f561d1e393a919099726b98774530074"} Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.638229 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55bc9ca675fb22e20c0a1fef5605a1b9f561d1e393a919099726b98774530074" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.638279 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6da2-account-create-update-h65t8" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.641526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c09-account-create-update-f5p6n" event={"ID":"8496e6be-0819-4878-a823-31f90c5fd272","Type":"ContainerDied","Data":"6975e10dc8323ccabeaa2274c4561d67a168dbb2f105027341828843f60774b0"} Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.641624 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c09-account-create-update-f5p6n" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.641635 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6975e10dc8323ccabeaa2274c4561d67a168dbb2f105027341828843f60774b0" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.643560 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9bf7-account-create-update-trgcj" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.643556 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9bf7-account-create-update-trgcj" event={"ID":"d0fe62a2-3b67-40cf-8d6c-fd68f9667276","Type":"ContainerDied","Data":"46c2da10b43d5db45bb14dbd5155556cd348053c923af10deb68cf50350d12bb"} Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.643660 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c2da10b43d5db45bb14dbd5155556cd348053c923af10deb68cf50350d12bb" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.644999 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-w556k" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.645018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w556k" event={"ID":"39f42416-5e4a-475b-862e-71cb10661178","Type":"ContainerDied","Data":"86666ec0321c8446e7695b0e9edb392e305dbcb7d603703a179d726d0f2ae2c4"} Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.645058 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86666ec0321c8446e7695b0e9edb392e305dbcb7d603703a179d726d0f2ae2c4" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.646106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wf8p8" event={"ID":"80ffc590-9556-4e8b-9faf-ed5df3a747a8","Type":"ContainerDied","Data":"0b5f008d92c4f35f3ce400ea3549c83b0a6c2ce92e1aab907ccd083a429c7998"} Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.646128 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b5f008d92c4f35f3ce400ea3549c83b0a6c2ce92e1aab907ccd083a429c7998" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.646199 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wf8p8" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.670811 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-297xq-config-dh48q"] Feb 18 06:05:12 crc kubenswrapper[4707]: E0218 06:05:12.671149 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f42416-5e4a-475b-862e-71cb10661178" containerName="mariadb-database-create" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671161 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f42416-5e4a-475b-862e-71cb10661178" containerName="mariadb-database-create" Feb 18 06:05:12 crc kubenswrapper[4707]: E0218 06:05:12.671184 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ffc590-9556-4e8b-9faf-ed5df3a747a8" containerName="mariadb-database-create" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671190 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ffc590-9556-4e8b-9faf-ed5df3a747a8" containerName="mariadb-database-create" Feb 18 06:05:12 crc kubenswrapper[4707]: E0218 06:05:12.671202 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456e788f-7f0e-429e-8049-e023596ef19b" containerName="dnsmasq-dns" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671208 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="456e788f-7f0e-429e-8049-e023596ef19b" containerName="dnsmasq-dns" Feb 18 06:05:12 crc kubenswrapper[4707]: E0218 06:05:12.671219 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1439301a-c008-4af2-bb69-6857397051f3" containerName="mariadb-database-create" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671225 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1439301a-c008-4af2-bb69-6857397051f3" containerName="mariadb-database-create" Feb 18 06:05:12 crc kubenswrapper[4707]: E0218 06:05:12.671234 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d0fe62a2-3b67-40cf-8d6c-fd68f9667276" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671240 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fe62a2-3b67-40cf-8d6c-fd68f9667276" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: E0218 06:05:12.671259 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8496e6be-0819-4878-a823-31f90c5fd272" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671264 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8496e6be-0819-4878-a823-31f90c5fd272" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: E0218 06:05:12.671273 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b042550-a34c-44f7-9a49-29bb4f865dbd" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b042550-a34c-44f7-9a49-29bb4f865dbd" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: E0218 06:05:12.671292 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456e788f-7f0e-429e-8049-e023596ef19b" containerName="init" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671299 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="456e788f-7f0e-429e-8049-e023596ef19b" containerName="init" Feb 18 06:05:12 crc kubenswrapper[4707]: E0218 06:05:12.671308 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee9488a-9a13-42d3-bedd-fa39647dd767" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671314 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee9488a-9a13-42d3-bedd-fa39647dd767" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 
06:05:12.671449 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f42416-5e4a-475b-862e-71cb10661178" containerName="mariadb-database-create" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671460 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="456e788f-7f0e-429e-8049-e023596ef19b" containerName="dnsmasq-dns" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671468 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1439301a-c008-4af2-bb69-6857397051f3" containerName="mariadb-database-create" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671478 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ffc590-9556-4e8b-9faf-ed5df3a747a8" containerName="mariadb-database-create" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee9488a-9a13-42d3-bedd-fa39647dd767" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671493 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b042550-a34c-44f7-9a49-29bb4f865dbd" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671504 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8496e6be-0819-4878-a823-31f90c5fd272" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.671513 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0fe62a2-3b67-40cf-8d6c-fd68f9667276" containerName="mariadb-account-create-update" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.672017 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.676921 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.694921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-297xq-config-dh48q"] Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.752851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-scripts\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.752931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.752998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-log-ovn\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.753025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-additional-scripts\") pod \"ovn-controller-297xq-config-dh48q\" (UID: 
\"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.753050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run-ovn\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.753087 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv6mc\" (UniqueName: \"kubernetes.io/projected/e3b0c4cf-98d1-4dd0-9f59-40405d125666-kube-api-access-cv6mc\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.854045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-scripts\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.854106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.854157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-log-ovn\") pod 
\"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.854179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-additional-scripts\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.854197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run-ovn\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.854226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv6mc\" (UniqueName: \"kubernetes.io/projected/e3b0c4cf-98d1-4dd0-9f59-40405d125666-kube-api-access-cv6mc\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.855032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-log-ovn\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.855102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run\") pod 
\"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.855711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-additional-scripts\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.855783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run-ovn\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.856390 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-scripts\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:12 crc kubenswrapper[4707]: I0218 06:05:12.868411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv6mc\" (UniqueName: \"kubernetes.io/projected/e3b0c4cf-98d1-4dd0-9f59-40405d125666-kube-api-access-cv6mc\") pod \"ovn-controller-297xq-config-dh48q\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:13 crc kubenswrapper[4707]: I0218 06:05:13.002986 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:13 crc kubenswrapper[4707]: I0218 06:05:13.495184 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-297xq-config-dh48q"] Feb 18 06:05:13 crc kubenswrapper[4707]: W0218 06:05:13.498898 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3b0c4cf_98d1_4dd0_9f59_40405d125666.slice/crio-2956befc943796465eb38406012b959b7f42dc876f2ce4adcd1f67a439733a20 WatchSource:0}: Error finding container 2956befc943796465eb38406012b959b7f42dc876f2ce4adcd1f67a439733a20: Status 404 returned error can't find the container with id 2956befc943796465eb38406012b959b7f42dc876f2ce4adcd1f67a439733a20 Feb 18 06:05:13 crc kubenswrapper[4707]: I0218 06:05:13.662696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-297xq-config-dh48q" event={"ID":"e3b0c4cf-98d1-4dd0-9f59-40405d125666","Type":"ContainerStarted","Data":"2956befc943796465eb38406012b959b7f42dc876f2ce4adcd1f67a439733a20"} Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.273068 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hv5xq"] Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.274382 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.276509 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.290676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383eff71-3b5a-48e7-9105-9d9932cba1e8-operator-scripts\") pod \"root-account-create-update-hv5xq\" (UID: \"383eff71-3b5a-48e7-9105-9d9932cba1e8\") " pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.290998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/383eff71-3b5a-48e7-9105-9d9932cba1e8-kube-api-access-52j77\") pod \"root-account-create-update-hv5xq\" (UID: \"383eff71-3b5a-48e7-9105-9d9932cba1e8\") " pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.292448 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hv5xq"] Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.394134 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383eff71-3b5a-48e7-9105-9d9932cba1e8-operator-scripts\") pod \"root-account-create-update-hv5xq\" (UID: \"383eff71-3b5a-48e7-9105-9d9932cba1e8\") " pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.394416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/383eff71-3b5a-48e7-9105-9d9932cba1e8-kube-api-access-52j77\") pod \"root-account-create-update-hv5xq\" (UID: 
\"383eff71-3b5a-48e7-9105-9d9932cba1e8\") " pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.395222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383eff71-3b5a-48e7-9105-9d9932cba1e8-operator-scripts\") pod \"root-account-create-update-hv5xq\" (UID: \"383eff71-3b5a-48e7-9105-9d9932cba1e8\") " pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.428747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/383eff71-3b5a-48e7-9105-9d9932cba1e8-kube-api-access-52j77\") pod \"root-account-create-update-hv5xq\" (UID: \"383eff71-3b5a-48e7-9105-9d9932cba1e8\") " pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.589937 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.673513 4707 generic.go:334] "Generic (PLEG): container finished" podID="e3b0c4cf-98d1-4dd0-9f59-40405d125666" containerID="859f719120d27b68300c803790fadce72cbf939390fe2b89fedc2d85f77a04b8" exitCode=0 Feb 18 06:05:14 crc kubenswrapper[4707]: I0218 06:05:14.673566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-297xq-config-dh48q" event={"ID":"e3b0c4cf-98d1-4dd0-9f59-40405d125666","Type":"ContainerDied","Data":"859f719120d27b68300c803790fadce72cbf939390fe2b89fedc2d85f77a04b8"} Feb 18 06:05:15 crc kubenswrapper[4707]: I0218 06:05:15.111387 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hv5xq"] Feb 18 06:05:15 crc kubenswrapper[4707]: I0218 06:05:15.683410 4707 generic.go:334] "Generic (PLEG): container finished" podID="383eff71-3b5a-48e7-9105-9d9932cba1e8" containerID="cb0b2b6d9acbce7710bd04e0bc0ac9801ca77a14b5298962d0f7e26510e708fb" exitCode=0 Feb 18 06:05:15 crc kubenswrapper[4707]: I0218 06:05:15.683514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hv5xq" event={"ID":"383eff71-3b5a-48e7-9105-9d9932cba1e8","Type":"ContainerDied","Data":"cb0b2b6d9acbce7710bd04e0bc0ac9801ca77a14b5298962d0f7e26510e708fb"} Feb 18 06:05:15 crc kubenswrapper[4707]: I0218 06:05:15.683847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hv5xq" event={"ID":"383eff71-3b5a-48e7-9105-9d9932cba1e8","Type":"ContainerStarted","Data":"63d3465abe176d26e3bdbc7cd80cae659c3db627339f1451e6f8534c9dd30777"} Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.020890 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.126294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e3b0c4cf-98d1-4dd0-9f59-40405d125666" (UID: "e3b0c4cf-98d1-4dd0-9f59-40405d125666"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.126362 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-additional-scripts\") pod \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.127479 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv6mc\" (UniqueName: \"kubernetes.io/projected/e3b0c4cf-98d1-4dd0-9f59-40405d125666-kube-api-access-cv6mc\") pod \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.127649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run-ovn\") pod \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.127825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-scripts\") pod \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 
06:05:16.127879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-log-ovn\") pod \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.127907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run\") pod \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\" (UID: \"e3b0c4cf-98d1-4dd0-9f59-40405d125666\") " Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.128883 4707 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.128928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run" (OuterVolumeSpecName: "var-run") pod "e3b0c4cf-98d1-4dd0-9f59-40405d125666" (UID: "e3b0c4cf-98d1-4dd0-9f59-40405d125666"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.128980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e3b0c4cf-98d1-4dd0-9f59-40405d125666" (UID: "e3b0c4cf-98d1-4dd0-9f59-40405d125666"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.130558 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e3b0c4cf-98d1-4dd0-9f59-40405d125666" (UID: "e3b0c4cf-98d1-4dd0-9f59-40405d125666"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.130512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-scripts" (OuterVolumeSpecName: "scripts") pod "e3b0c4cf-98d1-4dd0-9f59-40405d125666" (UID: "e3b0c4cf-98d1-4dd0-9f59-40405d125666"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.136503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b0c4cf-98d1-4dd0-9f59-40405d125666-kube-api-access-cv6mc" (OuterVolumeSpecName: "kube-api-access-cv6mc") pod "e3b0c4cf-98d1-4dd0-9f59-40405d125666" (UID: "e3b0c4cf-98d1-4dd0-9f59-40405d125666"). InnerVolumeSpecName "kube-api-access-cv6mc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.230369 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv6mc\" (UniqueName: \"kubernetes.io/projected/e3b0c4cf-98d1-4dd0-9f59-40405d125666-kube-api-access-cv6mc\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.230404 4707 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.230416 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b0c4cf-98d1-4dd0-9f59-40405d125666-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.230427 4707 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.230436 4707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3b0c4cf-98d1-4dd0-9f59-40405d125666-var-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.693186 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ff550e2-53ae-4f38-98d1-e95da8f7bde6" containerID="dc4464d0b69c7de0fdb818e8441d8b46e124b5ac6a0d625c606fe481cbd6701c" exitCode=0 Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.693259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rbttv" event={"ID":"4ff550e2-53ae-4f38-98d1-e95da8f7bde6","Type":"ContainerDied","Data":"dc4464d0b69c7de0fdb818e8441d8b46e124b5ac6a0d625c606fe481cbd6701c"} Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.695042 
4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-297xq-config-dh48q" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.695095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-297xq-config-dh48q" event={"ID":"e3b0c4cf-98d1-4dd0-9f59-40405d125666","Type":"ContainerDied","Data":"2956befc943796465eb38406012b959b7f42dc876f2ce4adcd1f67a439733a20"} Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.695121 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2956befc943796465eb38406012b959b7f42dc876f2ce4adcd1f67a439733a20" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.945415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:16 crc kubenswrapper[4707]: I0218 06:05:16.954486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5253fac6-1dd5-48c7-853a-f7cfa41840fa-etc-swift\") pod \"swift-storage-0\" (UID: \"5253fac6-1dd5-48c7-853a-f7cfa41840fa\") " pod="openstack/swift-storage-0" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.019522 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.024659 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.046741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383eff71-3b5a-48e7-9105-9d9932cba1e8-operator-scripts\") pod \"383eff71-3b5a-48e7-9105-9d9932cba1e8\" (UID: \"383eff71-3b5a-48e7-9105-9d9932cba1e8\") " Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.046891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/383eff71-3b5a-48e7-9105-9d9932cba1e8-kube-api-access-52j77\") pod \"383eff71-3b5a-48e7-9105-9d9932cba1e8\" (UID: \"383eff71-3b5a-48e7-9105-9d9932cba1e8\") " Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.047688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383eff71-3b5a-48e7-9105-9d9932cba1e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "383eff71-3b5a-48e7-9105-9d9932cba1e8" (UID: "383eff71-3b5a-48e7-9105-9d9932cba1e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.051400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383eff71-3b5a-48e7-9105-9d9932cba1e8-kube-api-access-52j77" (OuterVolumeSpecName: "kube-api-access-52j77") pod "383eff71-3b5a-48e7-9105-9d9932cba1e8" (UID: "383eff71-3b5a-48e7-9105-9d9932cba1e8"). InnerVolumeSpecName "kube-api-access-52j77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.106808 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-297xq-config-dh48q"] Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.116520 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-297xq-config-dh48q"] Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.153542 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383eff71-3b5a-48e7-9105-9d9932cba1e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.153563 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/383eff71-3b5a-48e7-9105-9d9932cba1e8-kube-api-access-52j77\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.351962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-297xq" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.563254 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 18 06:05:17 crc kubenswrapper[4707]: W0218 06:05:17.569055 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5253fac6_1dd5_48c7_853a_f7cfa41840fa.slice/crio-b6aca9fe32866b3e87e25ef268752d80bedbc96354f06cc51cbb32c74a18262f WatchSource:0}: Error finding container b6aca9fe32866b3e87e25ef268752d80bedbc96354f06cc51cbb32c74a18262f: Status 404 returned error can't find the container with id b6aca9fe32866b3e87e25ef268752d80bedbc96354f06cc51cbb32c74a18262f Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.702035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"b6aca9fe32866b3e87e25ef268752d80bedbc96354f06cc51cbb32c74a18262f"} Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.704229 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hv5xq" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.704222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hv5xq" event={"ID":"383eff71-3b5a-48e7-9105-9d9932cba1e8","Type":"ContainerDied","Data":"63d3465abe176d26e3bdbc7cd80cae659c3db627339f1451e6f8534c9dd30777"} Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.704284 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63d3465abe176d26e3bdbc7cd80cae659c3db627339f1451e6f8534c9dd30777" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.853350 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-czbp6"] Feb 18 06:05:17 crc kubenswrapper[4707]: E0218 06:05:17.853758 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b0c4cf-98d1-4dd0-9f59-40405d125666" containerName="ovn-config" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.853770 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b0c4cf-98d1-4dd0-9f59-40405d125666" containerName="ovn-config" Feb 18 06:05:17 crc kubenswrapper[4707]: E0218 06:05:17.853781 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383eff71-3b5a-48e7-9105-9d9932cba1e8" containerName="mariadb-account-create-update" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.853787 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="383eff71-3b5a-48e7-9105-9d9932cba1e8" containerName="mariadb-account-create-update" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.854021 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="383eff71-3b5a-48e7-9105-9d9932cba1e8" containerName="mariadb-account-create-update" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.854035 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b0c4cf-98d1-4dd0-9f59-40405d125666" containerName="ovn-config" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.854561 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.856941 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.857111 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9r2t2" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.860597 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-czbp6"] Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.865541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-db-sync-config-data\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.865606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-combined-ca-bundle\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.865668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48x5k\" (UniqueName: 
\"kubernetes.io/projected/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-kube-api-access-48x5k\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.865708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-config-data\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.966586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48x5k\" (UniqueName: \"kubernetes.io/projected/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-kube-api-access-48x5k\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.966656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-config-data\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.966714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-db-sync-config-data\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.966777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-combined-ca-bundle\") pod 
\"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.972571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-db-sync-config-data\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.972669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-config-data\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.973390 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-combined-ca-bundle\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:17 crc kubenswrapper[4707]: I0218 06:05:17.982542 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48x5k\" (UniqueName: \"kubernetes.io/projected/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-kube-api-access-48x5k\") pod \"glance-db-sync-czbp6\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.066695 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b0c4cf-98d1-4dd0-9f59-40405d125666" path="/var/lib/kubelet/pods/e3b0c4cf-98d1-4dd0-9f59-40405d125666/volumes" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.107905 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.170524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-ring-data-devices\") pod \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.171633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-dispersionconf\") pod \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.171716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdtlc\" (UniqueName: \"kubernetes.io/projected/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-kube-api-access-hdtlc\") pod \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.171759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-etc-swift\") pod \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.171831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-combined-ca-bundle\") pod \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.171885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-swiftconf\") pod \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.172221 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4ff550e2-53ae-4f38-98d1-e95da8f7bde6" (UID: "4ff550e2-53ae-4f38-98d1-e95da8f7bde6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.172303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-scripts\") pod \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\" (UID: \"4ff550e2-53ae-4f38-98d1-e95da8f7bde6\") " Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.172881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4ff550e2-53ae-4f38-98d1-e95da8f7bde6" (UID: "4ff550e2-53ae-4f38-98d1-e95da8f7bde6"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.172895 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.176486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-kube-api-access-hdtlc" (OuterVolumeSpecName: "kube-api-access-hdtlc") pod "4ff550e2-53ae-4f38-98d1-e95da8f7bde6" (UID: "4ff550e2-53ae-4f38-98d1-e95da8f7bde6"). InnerVolumeSpecName "kube-api-access-hdtlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.178054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4ff550e2-53ae-4f38-98d1-e95da8f7bde6" (UID: "4ff550e2-53ae-4f38-98d1-e95da8f7bde6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.194519 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.198925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-scripts" (OuterVolumeSpecName: "scripts") pod "4ff550e2-53ae-4f38-98d1-e95da8f7bde6" (UID: "4ff550e2-53ae-4f38-98d1-e95da8f7bde6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.199045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ff550e2-53ae-4f38-98d1-e95da8f7bde6" (UID: "4ff550e2-53ae-4f38-98d1-e95da8f7bde6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.207287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4ff550e2-53ae-4f38-98d1-e95da8f7bde6" (UID: "4ff550e2-53ae-4f38-98d1-e95da8f7bde6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.274562 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.274598 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.274609 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdtlc\" (UniqueName: \"kubernetes.io/projected/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-kube-api-access-hdtlc\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.274618 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:18 
crc kubenswrapper[4707]: I0218 06:05:18.274626 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.274636 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ff550e2-53ae-4f38-98d1-e95da8f7bde6-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.580731 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-czbp6"] Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.717593 4707 generic.go:334] "Generic (PLEG): container finished" podID="536dddc2-0691-4171-98b1-1462ddf6b38a" containerID="c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b" exitCode=0 Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.717654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"536dddc2-0691-4171-98b1-1462ddf6b38a","Type":"ContainerDied","Data":"c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b"} Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.726911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rbttv" event={"ID":"4ff550e2-53ae-4f38-98d1-e95da8f7bde6","Type":"ContainerDied","Data":"ee91f398a3b8c510e1391fecef3f73a1fef9280a1fe1834dce5994f734652eef"} Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.726979 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee91f398a3b8c510e1391fecef3f73a1fef9280a1fe1834dce5994f734652eef" Feb 18 06:05:18 crc kubenswrapper[4707]: I0218 06:05:18.727078 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rbttv" Feb 18 06:05:18 crc kubenswrapper[4707]: W0218 06:05:18.827320 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c15f7d_dfba_43f1_bb09_48d1ce940ed2.slice/crio-2b2c83d75958d68eae0291255bb1e31864f4bc121df2c39f5af5cdc04e898fa5 WatchSource:0}: Error finding container 2b2c83d75958d68eae0291255bb1e31864f4bc121df2c39f5af5cdc04e898fa5: Status 404 returned error can't find the container with id 2b2c83d75958d68eae0291255bb1e31864f4bc121df2c39f5af5cdc04e898fa5 Feb 18 06:05:19 crc kubenswrapper[4707]: I0218 06:05:19.734737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-czbp6" event={"ID":"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2","Type":"ContainerStarted","Data":"2b2c83d75958d68eae0291255bb1e31864f4bc121df2c39f5af5cdc04e898fa5"} Feb 18 06:05:19 crc kubenswrapper[4707]: I0218 06:05:19.737387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"c050949e75f2caa0907b7b3b9926ef244830b38107fb98e56a1b43fcee666c91"} Feb 18 06:05:19 crc kubenswrapper[4707]: I0218 06:05:19.739071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"536dddc2-0691-4171-98b1-1462ddf6b38a","Type":"ContainerStarted","Data":"2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf"} Feb 18 06:05:19 crc kubenswrapper[4707]: I0218 06:05:19.739282 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:05:19 crc kubenswrapper[4707]: I0218 06:05:19.775366 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.333985771 podStartE2EDuration="1m16.775349413s" podCreationTimestamp="2026-02-18 06:04:03 +0000 UTC" 
firstStartedPulling="2026-02-18 06:04:04.998923412 +0000 UTC m=+981.646882546" lastFinishedPulling="2026-02-18 06:04:44.440287034 +0000 UTC m=+1021.088246188" observedRunningTime="2026-02-18 06:05:19.762253244 +0000 UTC m=+1056.410212378" watchObservedRunningTime="2026-02-18 06:05:19.775349413 +0000 UTC m=+1056.423308547" Feb 18 06:05:20 crc kubenswrapper[4707]: I0218 06:05:20.480846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hv5xq"] Feb 18 06:05:20 crc kubenswrapper[4707]: I0218 06:05:20.485569 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hv5xq"] Feb 18 06:05:20 crc kubenswrapper[4707]: I0218 06:05:20.750784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"56c19fdeddc79ef8829fe7cd33d214058d6afb5e219ccfe67d32105784e8b87e"} Feb 18 06:05:20 crc kubenswrapper[4707]: I0218 06:05:20.750841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"8f407e58afc8bca01c7cf65a5dc31e8dc2cddfcd53a6fcf45fd978d1edd03742"} Feb 18 06:05:21 crc kubenswrapper[4707]: I0218 06:05:21.382142 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:05:21 crc kubenswrapper[4707]: I0218 06:05:21.382193 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Feb 18 06:05:21 crc kubenswrapper[4707]: I0218 06:05:21.761520 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"bfebe8f2504a59b52f252abb017e32e291e5d1ebe6643006ad86970fea929e7c"} Feb 18 06:05:22 crc kubenswrapper[4707]: I0218 06:05:22.073667 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383eff71-3b5a-48e7-9105-9d9932cba1e8" path="/var/lib/kubelet/pods/383eff71-3b5a-48e7-9105-9d9932cba1e8/volumes" Feb 18 06:05:23 crc kubenswrapper[4707]: I0218 06:05:23.780060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"f94ab67a2367af8fa98e3aa686dd68eaee0cd189a0c77fae47ac714c0e100118"} Feb 18 06:05:23 crc kubenswrapper[4707]: I0218 06:05:23.780589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"a826072cf76a320338e67ee494741399ae04282c5b5a5a99deb1f1fe5910aca2"} Feb 18 06:05:23 crc kubenswrapper[4707]: I0218 06:05:23.780601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"a78f46c23a9098fff93c67f798fd72f38394cf5d05095f085c4004fa66096eb6"} Feb 18 06:05:23 crc kubenswrapper[4707]: I0218 06:05:23.780609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"193ab540c0801165d380176ae9d65e47ebb680d45c9b78d73cec11864f3a5d9d"} Feb 18 06:05:23 crc kubenswrapper[4707]: I0218 06:05:23.782013 4707 generic.go:334] "Generic (PLEG): container finished" podID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" 
containerID="cd5038c856e6b32cf4df33653ad2d71f042ff62ccb85562b6962da27b37269dd" exitCode=0 Feb 18 06:05:23 crc kubenswrapper[4707]: I0218 06:05:23.782041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"298a4b48-6611-4cb4-8ccf-e9a00c23622b","Type":"ContainerDied","Data":"cd5038c856e6b32cf4df33653ad2d71f042ff62ccb85562b6962da27b37269dd"} Feb 18 06:05:24 crc kubenswrapper[4707]: I0218 06:05:24.794911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"298a4b48-6611-4cb4-8ccf-e9a00c23622b","Type":"ContainerStarted","Data":"fd972a2d3d4bd78031c578208b45afb2e0d2ec70227d127c35345986c0d1abd7"} Feb 18 06:05:24 crc kubenswrapper[4707]: I0218 06:05:24.795369 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 18 06:05:24 crc kubenswrapper[4707]: I0218 06:05:24.799989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"28c5b821a19fc42dbbd2500f67fea84d9e4cdd09cf0e6e609eb4dc5e34a63c0f"} Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.488568 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371953.366228 podStartE2EDuration="1m23.488548374s" podCreationTimestamp="2026-02-18 06:04:02 +0000 UTC" firstStartedPulling="2026-02-18 06:04:04.771685154 +0000 UTC m=+981.419644278" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:24.828637702 +0000 UTC m=+1061.476596836" watchObservedRunningTime="2026-02-18 06:05:25.488548374 +0000 UTC m=+1062.136507508" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.494183 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gk84m"] Feb 18 06:05:25 crc kubenswrapper[4707]: E0218 06:05:25.494503 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff550e2-53ae-4f38-98d1-e95da8f7bde6" containerName="swift-ring-rebalance" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.494519 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff550e2-53ae-4f38-98d1-e95da8f7bde6" containerName="swift-ring-rebalance" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.494674 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff550e2-53ae-4f38-98d1-e95da8f7bde6" containerName="swift-ring-rebalance" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.495127 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.505591 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.506947 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gk84m"] Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.652218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41ea2142-62af-4350-a711-2a7cbe23d990-operator-scripts\") pod \"root-account-create-update-gk84m\" (UID: \"41ea2142-62af-4350-a711-2a7cbe23d990\") " pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.652701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6hf\" (UniqueName: \"kubernetes.io/projected/41ea2142-62af-4350-a711-2a7cbe23d990-kube-api-access-sf6hf\") pod \"root-account-create-update-gk84m\" (UID: \"41ea2142-62af-4350-a711-2a7cbe23d990\") " pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.754991 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41ea2142-62af-4350-a711-2a7cbe23d990-operator-scripts\") pod \"root-account-create-update-gk84m\" (UID: \"41ea2142-62af-4350-a711-2a7cbe23d990\") " pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.755075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf6hf\" (UniqueName: \"kubernetes.io/projected/41ea2142-62af-4350-a711-2a7cbe23d990-kube-api-access-sf6hf\") pod \"root-account-create-update-gk84m\" (UID: \"41ea2142-62af-4350-a711-2a7cbe23d990\") " pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.756996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41ea2142-62af-4350-a711-2a7cbe23d990-operator-scripts\") pod \"root-account-create-update-gk84m\" (UID: \"41ea2142-62af-4350-a711-2a7cbe23d990\") " pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.773918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6hf\" (UniqueName: \"kubernetes.io/projected/41ea2142-62af-4350-a711-2a7cbe23d990-kube-api-access-sf6hf\") pod \"root-account-create-update-gk84m\" (UID: \"41ea2142-62af-4350-a711-2a7cbe23d990\") " pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.820667 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.824959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"f9aedf4988980b9bf1461ceac16d7c1e2f26fd2a16ff083cae4da9d5e14a5df3"} Feb 18 06:05:25 crc kubenswrapper[4707]: I0218 06:05:25.825029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"c8840fc1a6fc2210e015559dab1aa2018967dea576264ef9695d2eaae144a22e"} Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.068831 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gk84m"] Feb 18 06:05:32 crc kubenswrapper[4707]: W0218 06:05:32.077484 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ea2142_62af_4350_a711_2a7cbe23d990.slice/crio-53ac43c0e17d660d3bc53015c1d6988fd6df7ffbe48db9a5bb5ab1234ef6c60f WatchSource:0}: Error finding container 53ac43c0e17d660d3bc53015c1d6988fd6df7ffbe48db9a5bb5ab1234ef6c60f: Status 404 returned error can't find the container with id 53ac43c0e17d660d3bc53015c1d6988fd6df7ffbe48db9a5bb5ab1234ef6c60f Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.887639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-czbp6" event={"ID":"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2","Type":"ContainerStarted","Data":"93df32c77c64e39dd7b1ddb3f37b56c5a0eb5feee444f7f7657c564fd40e86cd"} Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.897780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"66788ab301f4eac085b5dd7ce8e2a8dca3f0590d9d5e1e6300a9d0208b412754"} Feb 
18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.897853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"e768a308fbf4363b596ccc095e52d20d64ba0bc18b52377cf4037764c77e43da"} Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.897863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"faf1c32a060fc3e0db962159f9465dfea655252c71cd05ff081817164fd6fba2"} Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.897872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5253fac6-1dd5-48c7-853a-f7cfa41840fa","Type":"ContainerStarted","Data":"d2857efbc4faafb6160b958d1e0c458dc0f9c8a8c902ded530d7cf41dc9ac229"} Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.906522 4707 generic.go:334] "Generic (PLEG): container finished" podID="41ea2142-62af-4350-a711-2a7cbe23d990" containerID="7dbc8d02dc0d56f344cc7f14a04727605f837de52038bcd6d542f740b27700ec" exitCode=0 Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.906600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk84m" event={"ID":"41ea2142-62af-4350-a711-2a7cbe23d990","Type":"ContainerDied","Data":"7dbc8d02dc0d56f344cc7f14a04727605f837de52038bcd6d542f740b27700ec"} Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.906764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk84m" event={"ID":"41ea2142-62af-4350-a711-2a7cbe23d990","Type":"ContainerStarted","Data":"53ac43c0e17d660d3bc53015c1d6988fd6df7ffbe48db9a5bb5ab1234ef6c60f"} Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.919254 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-czbp6" podStartSLOduration=2.996621041 
podStartE2EDuration="15.919230443s" podCreationTimestamp="2026-02-18 06:05:17 +0000 UTC" firstStartedPulling="2026-02-18 06:05:18.830073403 +0000 UTC m=+1055.478032537" lastFinishedPulling="2026-02-18 06:05:31.752682805 +0000 UTC m=+1068.400641939" observedRunningTime="2026-02-18 06:05:32.911123656 +0000 UTC m=+1069.559082790" watchObservedRunningTime="2026-02-18 06:05:32.919230443 +0000 UTC m=+1069.567189577" Feb 18 06:05:32 crc kubenswrapper[4707]: I0218 06:05:32.978281 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=27.045627758 podStartE2EDuration="33.978249029s" podCreationTimestamp="2026-02-18 06:04:59 +0000 UTC" firstStartedPulling="2026-02-18 06:05:17.571673932 +0000 UTC m=+1054.219633066" lastFinishedPulling="2026-02-18 06:05:24.504295203 +0000 UTC m=+1061.152254337" observedRunningTime="2026-02-18 06:05:32.960250058 +0000 UTC m=+1069.608209272" watchObservedRunningTime="2026-02-18 06:05:32.978249029 +0000 UTC m=+1069.626208203" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.242348 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-9xdbf"] Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.244520 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.247458 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.265665 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-9xdbf"] Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.427702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-config\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.427830 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.427894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.427949 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.428025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nnvh\" (UniqueName: \"kubernetes.io/projected/29a831ec-5884-4230-931d-4b62b461ed2e-kube-api-access-4nnvh\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.428079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.529473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.529569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.529603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.529658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nnvh\" (UniqueName: \"kubernetes.io/projected/29a831ec-5884-4230-931d-4b62b461ed2e-kube-api-access-4nnvh\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.529711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.529739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-config\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.530851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-config\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.530871 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 
06:05:33.530895 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.530876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.530906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.548338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nnvh\" (UniqueName: \"kubernetes.io/projected/29a831ec-5884-4230-931d-4b62b461ed2e-kube-api-access-4nnvh\") pod \"dnsmasq-dns-6d5b6d6b67-9xdbf\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.565026 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:33 crc kubenswrapper[4707]: I0218 06:05:33.986268 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-9xdbf"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.088971 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.364962 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.405736 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-6w79n"] Feb 18 06:05:34 crc kubenswrapper[4707]: E0218 06:05:34.406171 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ea2142-62af-4350-a711-2a7cbe23d990" containerName="mariadb-account-create-update" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.406192 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ea2142-62af-4350-a711-2a7cbe23d990" containerName="mariadb-account-create-update" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.406386 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ea2142-62af-4350-a711-2a7cbe23d990" containerName="mariadb-account-create-update" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.407102 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-6w79n" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.413856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-6w79n"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.446416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf6hf\" (UniqueName: \"kubernetes.io/projected/41ea2142-62af-4350-a711-2a7cbe23d990-kube-api-access-sf6hf\") pod \"41ea2142-62af-4350-a711-2a7cbe23d990\" (UID: \"41ea2142-62af-4350-a711-2a7cbe23d990\") " Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.446572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41ea2142-62af-4350-a711-2a7cbe23d990-operator-scripts\") pod \"41ea2142-62af-4350-a711-2a7cbe23d990\" (UID: \"41ea2142-62af-4350-a711-2a7cbe23d990\") " Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.447737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ea2142-62af-4350-a711-2a7cbe23d990-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41ea2142-62af-4350-a711-2a7cbe23d990" (UID: "41ea2142-62af-4350-a711-2a7cbe23d990"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.454435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ea2142-62af-4350-a711-2a7cbe23d990-kube-api-access-sf6hf" (OuterVolumeSpecName: "kube-api-access-sf6hf") pod "41ea2142-62af-4350-a711-2a7cbe23d990" (UID: "41ea2142-62af-4350-a711-2a7cbe23d990"). InnerVolumeSpecName "kube-api-access-sf6hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.469985 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.498562 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-398f-account-create-update-rf487"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.499925 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.507184 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.549036 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-398f-account-create-update-rf487"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.552718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64slr\" (UniqueName: \"kubernetes.io/projected/1b014870-bb4b-4241-bf1e-1b579389c879-kube-api-access-64slr\") pod \"manila-398f-account-create-update-rf487\" (UID: \"1b014870-bb4b-4241-bf1e-1b579389c879\") " pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.552788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b014870-bb4b-4241-bf1e-1b579389c879-operator-scripts\") pod \"manila-398f-account-create-update-rf487\" (UID: \"1b014870-bb4b-4241-bf1e-1b579389c879\") " pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.552995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wbw6r\" (UniqueName: \"kubernetes.io/projected/73b7b49a-3f2b-4961-ac42-b426af83bea2-kube-api-access-wbw6r\") pod \"manila-db-create-6w79n\" (UID: \"73b7b49a-3f2b-4961-ac42-b426af83bea2\") " pod="openstack/manila-db-create-6w79n" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.553019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73b7b49a-3f2b-4961-ac42-b426af83bea2-operator-scripts\") pod \"manila-db-create-6w79n\" (UID: \"73b7b49a-3f2b-4961-ac42-b426af83bea2\") " pod="openstack/manila-db-create-6w79n" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.553061 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf6hf\" (UniqueName: \"kubernetes.io/projected/41ea2142-62af-4350-a711-2a7cbe23d990-kube-api-access-sf6hf\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.553073 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41ea2142-62af-4350-a711-2a7cbe23d990-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.626185 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-vn5mm"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.629542 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.643676 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vn5mm"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.658473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbw6r\" (UniqueName: \"kubernetes.io/projected/73b7b49a-3f2b-4961-ac42-b426af83bea2-kube-api-access-wbw6r\") pod \"manila-db-create-6w79n\" (UID: \"73b7b49a-3f2b-4961-ac42-b426af83bea2\") " pod="openstack/manila-db-create-6w79n" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.658524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73b7b49a-3f2b-4961-ac42-b426af83bea2-operator-scripts\") pod \"manila-db-create-6w79n\" (UID: \"73b7b49a-3f2b-4961-ac42-b426af83bea2\") " pod="openstack/manila-db-create-6w79n" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.658585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64slr\" (UniqueName: \"kubernetes.io/projected/1b014870-bb4b-4241-bf1e-1b579389c879-kube-api-access-64slr\") pod \"manila-398f-account-create-update-rf487\" (UID: \"1b014870-bb4b-4241-bf1e-1b579389c879\") " pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.658624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b014870-bb4b-4241-bf1e-1b579389c879-operator-scripts\") pod \"manila-398f-account-create-update-rf487\" (UID: \"1b014870-bb4b-4241-bf1e-1b579389c879\") " pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.659652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/73b7b49a-3f2b-4961-ac42-b426af83bea2-operator-scripts\") pod \"manila-db-create-6w79n\" (UID: \"73b7b49a-3f2b-4961-ac42-b426af83bea2\") " pod="openstack/manila-db-create-6w79n" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.667601 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b014870-bb4b-4241-bf1e-1b579389c879-operator-scripts\") pod \"manila-398f-account-create-update-rf487\" (UID: \"1b014870-bb4b-4241-bf1e-1b579389c879\") " pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.681642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbw6r\" (UniqueName: \"kubernetes.io/projected/73b7b49a-3f2b-4961-ac42-b426af83bea2-kube-api-access-wbw6r\") pod \"manila-db-create-6w79n\" (UID: \"73b7b49a-3f2b-4961-ac42-b426af83bea2\") " pod="openstack/manila-db-create-6w79n" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.689580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64slr\" (UniqueName: \"kubernetes.io/projected/1b014870-bb4b-4241-bf1e-1b579389c879-kube-api-access-64slr\") pod \"manila-398f-account-create-update-rf487\" (UID: \"1b014870-bb4b-4241-bf1e-1b579389c879\") " pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.722612 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-6w79n" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.734774 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6939-account-create-update-747zz"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.735820 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.743457 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.751284 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6939-account-create-update-747zz"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.760381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c96gt\" (UniqueName: \"kubernetes.io/projected/c0171309-d2f6-4ff6-bcd3-ac892477355c-kube-api-access-c96gt\") pod \"cinder-db-create-vn5mm\" (UID: \"c0171309-d2f6-4ff6-bcd3-ac892477355c\") " pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.760785 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0171309-d2f6-4ff6-bcd3-ac892477355c-operator-scripts\") pod \"cinder-db-create-vn5mm\" (UID: \"c0171309-d2f6-4ff6-bcd3-ac892477355c\") " pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.816945 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hcll4"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.818602 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.828932 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.838142 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hcll4"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.862275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-operator-scripts\") pod \"neutron-db-create-hcll4\" (UID: \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\") " pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.862514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhl8x\" (UniqueName: \"kubernetes.io/projected/ed0ff79e-f841-48f7-9714-ca0c20783edf-kube-api-access-jhl8x\") pod \"cinder-6939-account-create-update-747zz\" (UID: \"ed0ff79e-f841-48f7-9714-ca0c20783edf\") " pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.862614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed0ff79e-f841-48f7-9714-ca0c20783edf-operator-scripts\") pod \"cinder-6939-account-create-update-747zz\" (UID: \"ed0ff79e-f841-48f7-9714-ca0c20783edf\") " pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.862727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c96gt\" (UniqueName: \"kubernetes.io/projected/c0171309-d2f6-4ff6-bcd3-ac892477355c-kube-api-access-c96gt\") pod \"cinder-db-create-vn5mm\" (UID: \"c0171309-d2f6-4ff6-bcd3-ac892477355c\") " pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.862835 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6jk\" (UniqueName: \"kubernetes.io/projected/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-kube-api-access-ms6jk\") pod \"neutron-db-create-hcll4\" (UID: \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\") " pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.862953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0171309-d2f6-4ff6-bcd3-ac892477355c-operator-scripts\") pod \"cinder-db-create-vn5mm\" (UID: \"c0171309-d2f6-4ff6-bcd3-ac892477355c\") " pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.863806 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0171309-d2f6-4ff6-bcd3-ac892477355c-operator-scripts\") pod \"cinder-db-create-vn5mm\" (UID: \"c0171309-d2f6-4ff6-bcd3-ac892477355c\") " pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.910999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c96gt\" (UniqueName: \"kubernetes.io/projected/c0171309-d2f6-4ff6-bcd3-ac892477355c-kube-api-access-c96gt\") pod \"cinder-db-create-vn5mm\" (UID: \"c0171309-d2f6-4ff6-bcd3-ac892477355c\") " pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.928591 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-dnt8j"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.930424 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.942824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk84m" event={"ID":"41ea2142-62af-4350-a711-2a7cbe23d990","Type":"ContainerDied","Data":"53ac43c0e17d660d3bc53015c1d6988fd6df7ffbe48db9a5bb5ab1234ef6c60f"} Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.942862 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53ac43c0e17d660d3bc53015c1d6988fd6df7ffbe48db9a5bb5ab1234ef6c60f" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.942935 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gk84m" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.958485 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.965461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-operator-scripts\") pod \"neutron-db-create-hcll4\" (UID: \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\") " pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.965539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhl8x\" (UniqueName: \"kubernetes.io/projected/ed0ff79e-f841-48f7-9714-ca0c20783edf-kube-api-access-jhl8x\") pod \"cinder-6939-account-create-update-747zz\" (UID: \"ed0ff79e-f841-48f7-9714-ca0c20783edf\") " pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.965589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ed0ff79e-f841-48f7-9714-ca0c20783edf-operator-scripts\") pod \"cinder-6939-account-create-update-747zz\" (UID: \"ed0ff79e-f841-48f7-9714-ca0c20783edf\") " pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.965631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6jk\" (UniqueName: \"kubernetes.io/projected/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-kube-api-access-ms6jk\") pod \"neutron-db-create-hcll4\" (UID: \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\") " pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.967162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-operator-scripts\") pod \"neutron-db-create-hcll4\" (UID: \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\") " pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.967296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed0ff79e-f841-48f7-9714-ca0c20783edf-operator-scripts\") pod \"cinder-6939-account-create-update-747zz\" (UID: \"ed0ff79e-f841-48f7-9714-ca0c20783edf\") " pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.977059 4707 generic.go:334] "Generic (PLEG): container finished" podID="29a831ec-5884-4230-931d-4b62b461ed2e" containerID="05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858" exitCode=0 Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.977116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" event={"ID":"29a831ec-5884-4230-931d-4b62b461ed2e","Type":"ContainerDied","Data":"05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858"} Feb 18 06:05:34 crc 
kubenswrapper[4707]: I0218 06:05:34.977151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" event={"ID":"29a831ec-5884-4230-931d-4b62b461ed2e","Type":"ContainerStarted","Data":"51df5b5d2f38891a4584d5a5bbe2a334eb460c75fda66fd1b2343bafc30f7a06"} Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.982376 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3c9b-account-create-update-qs4bz"] Feb 18 06:05:34 crc kubenswrapper[4707]: I0218 06:05:34.983734 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.002674 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.022879 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dnt8j"] Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.031932 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms6jk\" (UniqueName: \"kubernetes.io/projected/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-kube-api-access-ms6jk\") pod \"neutron-db-create-hcll4\" (UID: \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\") " pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.046952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhl8x\" (UniqueName: \"kubernetes.io/projected/ed0ff79e-f841-48f7-9714-ca0c20783edf-kube-api-access-jhl8x\") pod \"cinder-6939-account-create-update-747zz\" (UID: \"ed0ff79e-f841-48f7-9714-ca0c20783edf\") " pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.051189 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3c9b-account-create-update-qs4bz"] Feb 18 06:05:35 crc 
kubenswrapper[4707]: I0218 06:05:35.067250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-operator-scripts\") pod \"barbican-db-create-dnt8j\" (UID: \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\") " pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.067341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-operator-scripts\") pod \"barbican-3c9b-account-create-update-qs4bz\" (UID: \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\") " pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.067403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9692q\" (UniqueName: \"kubernetes.io/projected/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-kube-api-access-9692q\") pod \"barbican-3c9b-account-create-update-qs4bz\" (UID: \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\") " pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.067462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpshg\" (UniqueName: \"kubernetes.io/projected/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-kube-api-access-xpshg\") pod \"barbican-db-create-dnt8j\" (UID: \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\") " pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.073446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.144597 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.160862 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-861c-account-create-update-fnpxl"] Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.169347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-operator-scripts\") pod \"barbican-db-create-dnt8j\" (UID: \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\") " pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.169414 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-operator-scripts\") pod \"barbican-3c9b-account-create-update-qs4bz\" (UID: \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\") " pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.169475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9692q\" (UniqueName: \"kubernetes.io/projected/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-kube-api-access-9692q\") pod \"barbican-3c9b-account-create-update-qs4bz\" (UID: \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\") " pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.169541 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpshg\" (UniqueName: \"kubernetes.io/projected/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-kube-api-access-xpshg\") pod \"barbican-db-create-dnt8j\" (UID: \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\") " pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.171858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-operator-scripts\") pod \"barbican-db-create-dnt8j\" (UID: \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\") " pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.172584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-operator-scripts\") pod \"barbican-3c9b-account-create-update-qs4bz\" (UID: \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\") " pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.185165 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dhgkp"] Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.186361 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.186851 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.190742 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.191007 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.191203 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lqdm4" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.191317 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.195587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9692q\" (UniqueName: \"kubernetes.io/projected/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-kube-api-access-9692q\") pod \"barbican-3c9b-account-create-update-qs4bz\" (UID: \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\") " pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.195913 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.200239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpshg\" (UniqueName: \"kubernetes.io/projected/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-kube-api-access-xpshg\") pod \"barbican-db-create-dnt8j\" (UID: \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\") " pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.210062 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-861c-account-create-update-fnpxl"] Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.214939 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.224838 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dhgkp"] Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.260289 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.399830 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-combined-ca-bundle\") pod \"keystone-db-sync-dhgkp\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.408684 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rpw2\" (UniqueName: \"kubernetes.io/projected/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-kube-api-access-7rpw2\") pod \"neutron-861c-account-create-update-fnpxl\" (UID: \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\") " pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.408771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-operator-scripts\") pod \"neutron-861c-account-create-update-fnpxl\" (UID: \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\") " pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.408809 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-config-data\") pod 
\"keystone-db-sync-dhgkp\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.408871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdq7r\" (UniqueName: \"kubernetes.io/projected/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-kube-api-access-xdq7r\") pod \"keystone-db-sync-dhgkp\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.451113 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-6w79n"] Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.509929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-config-data\") pod \"keystone-db-sync-dhgkp\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.509970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-operator-scripts\") pod \"neutron-861c-account-create-update-fnpxl\" (UID: \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\") " pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.509995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdq7r\" (UniqueName: \"kubernetes.io/projected/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-kube-api-access-xdq7r\") pod \"keystone-db-sync-dhgkp\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.510100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-combined-ca-bundle\") pod \"keystone-db-sync-dhgkp\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.510128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rpw2\" (UniqueName: \"kubernetes.io/projected/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-kube-api-access-7rpw2\") pod \"neutron-861c-account-create-update-fnpxl\" (UID: \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\") " pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.510741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-operator-scripts\") pod \"neutron-861c-account-create-update-fnpxl\" (UID: \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\") " pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.517389 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-combined-ca-bundle\") pod \"keystone-db-sync-dhgkp\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.517956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-config-data\") pod \"keystone-db-sync-dhgkp\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.532204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rpw2\" (UniqueName: 
\"kubernetes.io/projected/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-kube-api-access-7rpw2\") pod \"neutron-861c-account-create-update-fnpxl\" (UID: \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\") " pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.537721 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdq7r\" (UniqueName: \"kubernetes.io/projected/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-kube-api-access-xdq7r\") pod \"keystone-db-sync-dhgkp\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.575708 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.617462 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.645213 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-398f-account-create-update-rf487"] Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.811510 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vn5mm"] Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.831322 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6939-account-create-update-747zz"] Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.992277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vn5mm" event={"ID":"c0171309-d2f6-4ff6-bcd3-ac892477355c","Type":"ContainerStarted","Data":"5edc7a9f67aa7bf9c91a9fdfac4d08bd78caff91c1244e807b7b622b5aeeb6b8"} Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.995316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-6w79n" 
event={"ID":"73b7b49a-3f2b-4961-ac42-b426af83bea2","Type":"ContainerStarted","Data":"70a2eeea7de4052363d0b21f8e94ef8781499bdf50dec7837a09fc61afa0c6ac"} Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.995448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-6w79n" event={"ID":"73b7b49a-3f2b-4961-ac42-b426af83bea2","Type":"ContainerStarted","Data":"97938c1e2a4047f11ee4430af6f6f92897a354831de39fdcf10db486d33fc7c4"} Feb 18 06:05:35 crc kubenswrapper[4707]: I0218 06:05:35.997518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6939-account-create-update-747zz" event={"ID":"ed0ff79e-f841-48f7-9714-ca0c20783edf","Type":"ContainerStarted","Data":"a7759c7cfb919f9b75d0bcabbaa9d1e7765d5c69f72c645cadd73dc0a218be89"} Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.005766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" event={"ID":"29a831ec-5884-4230-931d-4b62b461ed2e","Type":"ContainerStarted","Data":"eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133"} Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.005857 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.007866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-398f-account-create-update-rf487" event={"ID":"1b014870-bb4b-4241-bf1e-1b579389c879","Type":"ContainerStarted","Data":"4cfbcb18f285aab87f52594a6e5485f7310b9269b167ac8b6f9eb153215d6dd9"} Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.007889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-398f-account-create-update-rf487" event={"ID":"1b014870-bb4b-4241-bf1e-1b579389c879","Type":"ContainerStarted","Data":"c27a33e83aba92fd9d688f315c768cce159188704a5e6a0698f7583d026a958e"} Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.019604 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-6w79n" podStartSLOduration=2.019585606 podStartE2EDuration="2.019585606s" podCreationTimestamp="2026-02-18 06:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:36.013045171 +0000 UTC m=+1072.661004295" watchObservedRunningTime="2026-02-18 06:05:36.019585606 +0000 UTC m=+1072.667544740" Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.051896 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hcll4"] Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.121989 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dnt8j"] Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.133450 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-398f-account-create-update-rf487" podStartSLOduration=2.133427026 podStartE2EDuration="2.133427026s" podCreationTimestamp="2026-02-18 06:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:36.035998245 +0000 UTC m=+1072.683957379" watchObservedRunningTime="2026-02-18 06:05:36.133427026 +0000 UTC m=+1072.781386160" Feb 18 06:05:36 crc kubenswrapper[4707]: W0218 06:05:36.138088 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc4e65cf_2f40_47a2_85b6_e46b9d2712f8.slice/crio-f0250b5d05538b679580cb798a17f6f0a013a9942d8e303872bef33e7bd32167 WatchSource:0}: Error finding container f0250b5d05538b679580cb798a17f6f0a013a9942d8e303872bef33e7bd32167: Status 404 returned error can't find the container with id f0250b5d05538b679580cb798a17f6f0a013a9942d8e303872bef33e7bd32167 Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.153298 
4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3c9b-account-create-update-qs4bz"] Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.160926 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" podStartSLOduration=3.160893299 podStartE2EDuration="3.160893299s" podCreationTimestamp="2026-02-18 06:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:36.068343928 +0000 UTC m=+1072.716303062" watchObservedRunningTime="2026-02-18 06:05:36.160893299 +0000 UTC m=+1072.808852433" Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.171215 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-861c-account-create-update-fnpxl"] Feb 18 06:05:36 crc kubenswrapper[4707]: I0218 06:05:36.183565 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dhgkp"] Feb 18 06:05:36 crc kubenswrapper[4707]: W0218 06:05:36.263227 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a675c2d_6f3e_46d9_8efe_7f7e5570b9a4.slice/crio-2270c87cb4affa9430ddb58e0a7ff4bd6a748d7b9df4bd92f2921aabcf7c4afa WatchSource:0}: Error finding container 2270c87cb4affa9430ddb58e0a7ff4bd6a748d7b9df4bd92f2921aabcf7c4afa: Status 404 returned error can't find the container with id 2270c87cb4affa9430ddb58e0a7ff4bd6a748d7b9df4bd92f2921aabcf7c4afa Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.037202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dhgkp" event={"ID":"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4","Type":"ContainerStarted","Data":"2270c87cb4affa9430ddb58e0a7ff4bd6a748d7b9df4bd92f2921aabcf7c4afa"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.040755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-dnt8j" event={"ID":"24b8210d-4e51-4dae-b09c-dbf714c1ca7e","Type":"ContainerStarted","Data":"554122bf0bb556257b07f9548a7bc6c40cc2a78ce08254bfae8c87e86ef6300f"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.040863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dnt8j" event={"ID":"24b8210d-4e51-4dae-b09c-dbf714c1ca7e","Type":"ContainerStarted","Data":"0e971a62f57b116d8f5b66b48cbce029129b68cc40bc36559e63caf04bdb555c"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.044010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c9b-account-create-update-qs4bz" event={"ID":"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d","Type":"ContainerStarted","Data":"dfce4a2781663f21df1a056f62db0f42e7fd992c53179ed56a4a5938f2814a4f"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.044045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c9b-account-create-update-qs4bz" event={"ID":"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d","Type":"ContainerStarted","Data":"c2b7ff9859654016a59872c9799c6eb91d311c175b6d101300865f4f1b6b8ba7"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.046535 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed0ff79e-f841-48f7-9714-ca0c20783edf" containerID="8710a9fc36de1d8ea404a56cc48948abfebf625887957d5fe45131c8192c3be2" exitCode=0 Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.046598 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6939-account-create-update-747zz" event={"ID":"ed0ff79e-f841-48f7-9714-ca0c20783edf","Type":"ContainerDied","Data":"8710a9fc36de1d8ea404a56cc48948abfebf625887957d5fe45131c8192c3be2"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.053257 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc4e65cf-2f40-47a2-85b6-e46b9d2712f8" containerID="48294d1862605c74084bdb820b5a7da371f2eb93b3dafc638792122652216800" exitCode=0 Feb 
18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.053302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-861c-account-create-update-fnpxl" event={"ID":"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8","Type":"ContainerDied","Data":"48294d1862605c74084bdb820b5a7da371f2eb93b3dafc638792122652216800"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.053350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-861c-account-create-update-fnpxl" event={"ID":"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8","Type":"ContainerStarted","Data":"f0250b5d05538b679580cb798a17f6f0a013a9942d8e303872bef33e7bd32167"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.058309 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-dnt8j" podStartSLOduration=3.058293041 podStartE2EDuration="3.058293041s" podCreationTimestamp="2026-02-18 06:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:37.05826727 +0000 UTC m=+1073.706226394" watchObservedRunningTime="2026-02-18 06:05:37.058293041 +0000 UTC m=+1073.706252175" Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.061553 4707 generic.go:334] "Generic (PLEG): container finished" podID="1b014870-bb4b-4241-bf1e-1b579389c879" containerID="4cfbcb18f285aab87f52594a6e5485f7310b9269b167ac8b6f9eb153215d6dd9" exitCode=0 Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.061709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-398f-account-create-update-rf487" event={"ID":"1b014870-bb4b-4241-bf1e-1b579389c879","Type":"ContainerDied","Data":"4cfbcb18f285aab87f52594a6e5485f7310b9269b167ac8b6f9eb153215d6dd9"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.063524 4707 generic.go:334] "Generic (PLEG): container finished" podID="c0171309-d2f6-4ff6-bcd3-ac892477355c" 
containerID="a1f56cd7ec7dbf1cfec672b58cc27ba61f8a0d7bbaf37a5c07a9e4a14ca8e713" exitCode=0 Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.063557 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vn5mm" event={"ID":"c0171309-d2f6-4ff6-bcd3-ac892477355c","Type":"ContainerDied","Data":"a1f56cd7ec7dbf1cfec672b58cc27ba61f8a0d7bbaf37a5c07a9e4a14ca8e713"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.066240 4707 generic.go:334] "Generic (PLEG): container finished" podID="73b7b49a-3f2b-4961-ac42-b426af83bea2" containerID="70a2eeea7de4052363d0b21f8e94ef8781499bdf50dec7837a09fc61afa0c6ac" exitCode=0 Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.066337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-6w79n" event={"ID":"73b7b49a-3f2b-4961-ac42-b426af83bea2","Type":"ContainerDied","Data":"70a2eeea7de4052363d0b21f8e94ef8781499bdf50dec7837a09fc61afa0c6ac"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.067940 4707 generic.go:334] "Generic (PLEG): container finished" podID="a33a8c27-088e-4cc4-9447-3e0e2d1e3e85" containerID="67ff09dce43603fcb7ef6b4a16eec7de5d913b3577a068bc70d3b28eb91ef8ae" exitCode=0 Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.068182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hcll4" event={"ID":"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85","Type":"ContainerDied","Data":"67ff09dce43603fcb7ef6b4a16eec7de5d913b3577a068bc70d3b28eb91ef8ae"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.068205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hcll4" event={"ID":"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85","Type":"ContainerStarted","Data":"4700216486e39a98c5ad8d3331783e71ec66e2600db9ba772defc75c2a23ddc6"} Feb 18 06:05:37 crc kubenswrapper[4707]: I0218 06:05:37.086776 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-3c9b-account-create-update-qs4bz" podStartSLOduration=3.086761131 podStartE2EDuration="3.086761131s" podCreationTimestamp="2026-02-18 06:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:37.083120144 +0000 UTC m=+1073.731079278" watchObservedRunningTime="2026-02-18 06:05:37.086761131 +0000 UTC m=+1073.734720265" Feb 18 06:05:38 crc kubenswrapper[4707]: I0218 06:05:38.087682 4707 generic.go:334] "Generic (PLEG): container finished" podID="24b8210d-4e51-4dae-b09c-dbf714c1ca7e" containerID="554122bf0bb556257b07f9548a7bc6c40cc2a78ce08254bfae8c87e86ef6300f" exitCode=0 Feb 18 06:05:38 crc kubenswrapper[4707]: I0218 06:05:38.087872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dnt8j" event={"ID":"24b8210d-4e51-4dae-b09c-dbf714c1ca7e","Type":"ContainerDied","Data":"554122bf0bb556257b07f9548a7bc6c40cc2a78ce08254bfae8c87e86ef6300f"} Feb 18 06:05:38 crc kubenswrapper[4707]: I0218 06:05:38.093301 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d" containerID="dfce4a2781663f21df1a056f62db0f42e7fd992c53179ed56a4a5938f2814a4f" exitCode=0 Feb 18 06:05:38 crc kubenswrapper[4707]: I0218 06:05:38.093462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c9b-account-create-update-qs4bz" event={"ID":"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d","Type":"ContainerDied","Data":"dfce4a2781663f21df1a056f62db0f42e7fd992c53179ed56a4a5938f2814a4f"} Feb 18 06:05:40 crc kubenswrapper[4707]: I0218 06:05:40.110762 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2c15f7d-dfba-43f1-bb09-48d1ce940ed2" containerID="93df32c77c64e39dd7b1ddb3f37b56c5a0eb5feee444f7f7657c564fd40e86cd" exitCode=0 Feb 18 06:05:40 crc kubenswrapper[4707]: I0218 06:05:40.111310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-czbp6" event={"ID":"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2","Type":"ContainerDied","Data":"93df32c77c64e39dd7b1ddb3f37b56c5a0eb5feee444f7f7657c564fd40e86cd"} Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.020218 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.029227 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.036899 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.085336 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.098915 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed0ff79e-f841-48f7-9714-ca0c20783edf-operator-scripts\") pod \"ed0ff79e-f841-48f7-9714-ca0c20783edf\" (UID: \"ed0ff79e-f841-48f7-9714-ca0c20783edf\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-operator-scripts\") pod \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\" (UID: \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpshg\" (UniqueName: \"kubernetes.io/projected/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-kube-api-access-xpshg\") pod \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\" (UID: \"24b8210d-4e51-4dae-b09c-dbf714c1ca7e\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-operator-scripts\") pod \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\" (UID: \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c96gt\" (UniqueName: \"kubernetes.io/projected/c0171309-d2f6-4ff6-bcd3-ac892477355c-kube-api-access-c96gt\") pod \"c0171309-d2f6-4ff6-bcd3-ac892477355c\" (UID: \"c0171309-d2f6-4ff6-bcd3-ac892477355c\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108715 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0171309-d2f6-4ff6-bcd3-ac892477355c-operator-scripts\") pod \"c0171309-d2f6-4ff6-bcd3-ac892477355c\" (UID: \"c0171309-d2f6-4ff6-bcd3-ac892477355c\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhl8x\" (UniqueName: \"kubernetes.io/projected/ed0ff79e-f841-48f7-9714-ca0c20783edf-kube-api-access-jhl8x\") pod \"ed0ff79e-f841-48f7-9714-ca0c20783edf\" (UID: \"ed0ff79e-f841-48f7-9714-ca0c20783edf\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64slr\" (UniqueName: \"kubernetes.io/projected/1b014870-bb4b-4241-bf1e-1b579389c879-kube-api-access-64slr\") pod \"1b014870-bb4b-4241-bf1e-1b579389c879\" (UID: \"1b014870-bb4b-4241-bf1e-1b579389c879\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108821 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b014870-bb4b-4241-bf1e-1b579389c879-operator-scripts\") pod \"1b014870-bb4b-4241-bf1e-1b579389c879\" (UID: \"1b014870-bb4b-4241-bf1e-1b579389c879\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.108835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9692q\" (UniqueName: \"kubernetes.io/projected/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-kube-api-access-9692q\") pod \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\" (UID: \"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.109674 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"24b8210d-4e51-4dae-b09c-dbf714c1ca7e" (UID: "24b8210d-4e51-4dae-b09c-dbf714c1ca7e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.109997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b014870-bb4b-4241-bf1e-1b579389c879-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b014870-bb4b-4241-bf1e-1b579389c879" (UID: "1b014870-bb4b-4241-bf1e-1b579389c879"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.110220 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed0ff79e-f841-48f7-9714-ca0c20783edf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed0ff79e-f841-48f7-9714-ca0c20783edf" (UID: "ed0ff79e-f841-48f7-9714-ca0c20783edf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.110241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0171309-d2f6-4ff6-bcd3-ac892477355c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0171309-d2f6-4ff6-bcd3-ac892477355c" (UID: "c0171309-d2f6-4ff6-bcd3-ac892477355c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.115401 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0ff79e-f841-48f7-9714-ca0c20783edf-kube-api-access-jhl8x" (OuterVolumeSpecName: "kube-api-access-jhl8x") pod "ed0ff79e-f841-48f7-9714-ca0c20783edf" (UID: "ed0ff79e-f841-48f7-9714-ca0c20783edf"). InnerVolumeSpecName "kube-api-access-jhl8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.115676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0171309-d2f6-4ff6-bcd3-ac892477355c-kube-api-access-c96gt" (OuterVolumeSpecName: "kube-api-access-c96gt") pod "c0171309-d2f6-4ff6-bcd3-ac892477355c" (UID: "c0171309-d2f6-4ff6-bcd3-ac892477355c"). InnerVolumeSpecName "kube-api-access-c96gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.117230 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d" (UID: "ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.123244 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-kube-api-access-9692q" (OuterVolumeSpecName: "kube-api-access-9692q") pod "ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d" (UID: "ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d"). InnerVolumeSpecName "kube-api-access-9692q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.127722 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.131980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-kube-api-access-xpshg" (OuterVolumeSpecName: "kube-api-access-xpshg") pod "24b8210d-4e51-4dae-b09c-dbf714c1ca7e" (UID: "24b8210d-4e51-4dae-b09c-dbf714c1ca7e"). 
InnerVolumeSpecName "kube-api-access-xpshg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.138255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b014870-bb4b-4241-bf1e-1b579389c879-kube-api-access-64slr" (OuterVolumeSpecName: "kube-api-access-64slr") pod "1b014870-bb4b-4241-bf1e-1b579389c879" (UID: "1b014870-bb4b-4241-bf1e-1b579389c879"). InnerVolumeSpecName "kube-api-access-64slr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.141856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-398f-account-create-update-rf487" event={"ID":"1b014870-bb4b-4241-bf1e-1b579389c879","Type":"ContainerDied","Data":"c27a33e83aba92fd9d688f315c768cce159188704a5e6a0698f7583d026a958e"} Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.141938 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c27a33e83aba92fd9d688f315c768cce159188704a5e6a0698f7583d026a958e" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.142136 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-398f-account-create-update-rf487" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.149419 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vn5mm" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.149479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vn5mm" event={"ID":"c0171309-d2f6-4ff6-bcd3-ac892477355c","Type":"ContainerDied","Data":"5edc7a9f67aa7bf9c91a9fdfac4d08bd78caff91c1244e807b7b622b5aeeb6b8"} Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.149573 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5edc7a9f67aa7bf9c91a9fdfac4d08bd78caff91c1244e807b7b622b5aeeb6b8" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.152505 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c9b-account-create-update-qs4bz" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.152773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c9b-account-create-update-qs4bz" event={"ID":"ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d","Type":"ContainerDied","Data":"c2b7ff9859654016a59872c9799c6eb91d311c175b6d101300865f4f1b6b8ba7"} Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.152842 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b7ff9859654016a59872c9799c6eb91d311c175b6d101300865f4f1b6b8ba7" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.155602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-6w79n" event={"ID":"73b7b49a-3f2b-4961-ac42-b426af83bea2","Type":"ContainerDied","Data":"97938c1e2a4047f11ee4430af6f6f92897a354831de39fdcf10db486d33fc7c4"} Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.155689 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97938c1e2a4047f11ee4430af6f6f92897a354831de39fdcf10db486d33fc7c4" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.157703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-hcll4" event={"ID":"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85","Type":"ContainerDied","Data":"4700216486e39a98c5ad8d3331783e71ec66e2600db9ba772defc75c2a23ddc6"} Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.157740 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4700216486e39a98c5ad8d3331783e71ec66e2600db9ba772defc75c2a23ddc6" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.159402 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6939-account-create-update-747zz" event={"ID":"ed0ff79e-f841-48f7-9714-ca0c20783edf","Type":"ContainerDied","Data":"a7759c7cfb919f9b75d0bcabbaa9d1e7765d5c69f72c645cadd73dc0a218be89"} Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.159433 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7759c7cfb919f9b75d0bcabbaa9d1e7765d5c69f72c645cadd73dc0a218be89" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.159503 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6939-account-create-update-747zz" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.179349 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-861c-account-create-update-fnpxl" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.179620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-861c-account-create-update-fnpxl" event={"ID":"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8","Type":"ContainerDied","Data":"f0250b5d05538b679580cb798a17f6f0a013a9942d8e303872bef33e7bd32167"} Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.179672 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0250b5d05538b679580cb798a17f6f0a013a9942d8e303872bef33e7bd32167" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.186331 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dnt8j" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.187394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dnt8j" event={"ID":"24b8210d-4e51-4dae-b09c-dbf714c1ca7e","Type":"ContainerDied","Data":"0e971a62f57b116d8f5b66b48cbce029129b68cc40bc36559e63caf04bdb555c"} Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.187428 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e971a62f57b116d8f5b66b48cbce029129b68cc40bc36559e63caf04bdb555c" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.191361 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.196081 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-6w79n" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213869 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213899 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c96gt\" (UniqueName: \"kubernetes.io/projected/c0171309-d2f6-4ff6-bcd3-ac892477355c-kube-api-access-c96gt\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213910 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0171309-d2f6-4ff6-bcd3-ac892477355c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213919 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhl8x\" (UniqueName: \"kubernetes.io/projected/ed0ff79e-f841-48f7-9714-ca0c20783edf-kube-api-access-jhl8x\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213929 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64slr\" (UniqueName: \"kubernetes.io/projected/1b014870-bb4b-4241-bf1e-1b579389c879-kube-api-access-64slr\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213939 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b014870-bb4b-4241-bf1e-1b579389c879-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213947 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9692q\" (UniqueName: \"kubernetes.io/projected/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d-kube-api-access-9692q\") on node \"crc\" 
DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213955 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed0ff79e-f841-48f7-9714-ca0c20783edf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213963 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.213971 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpshg\" (UniqueName: \"kubernetes.io/projected/24b8210d-4e51-4dae-b09c-dbf714c1ca7e-kube-api-access-xpshg\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.314596 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-operator-scripts\") pod \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\" (UID: \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.314680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbw6r\" (UniqueName: \"kubernetes.io/projected/73b7b49a-3f2b-4961-ac42-b426af83bea2-kube-api-access-wbw6r\") pod \"73b7b49a-3f2b-4961-ac42-b426af83bea2\" (UID: \"73b7b49a-3f2b-4961-ac42-b426af83bea2\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.314731 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rpw2\" (UniqueName: \"kubernetes.io/projected/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-kube-api-access-7rpw2\") pod \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\" (UID: \"fc4e65cf-2f40-47a2-85b6-e46b9d2712f8\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.314816 
4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73b7b49a-3f2b-4961-ac42-b426af83bea2-operator-scripts\") pod \"73b7b49a-3f2b-4961-ac42-b426af83bea2\" (UID: \"73b7b49a-3f2b-4961-ac42-b426af83bea2\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.314858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-operator-scripts\") pod \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\" (UID: \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.314990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms6jk\" (UniqueName: \"kubernetes.io/projected/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-kube-api-access-ms6jk\") pod \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\" (UID: \"a33a8c27-088e-4cc4-9447-3e0e2d1e3e85\") " Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.316024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a33a8c27-088e-4cc4-9447-3e0e2d1e3e85" (UID: "a33a8c27-088e-4cc4-9447-3e0e2d1e3e85"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.316211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc4e65cf-2f40-47a2-85b6-e46b9d2712f8" (UID: "fc4e65cf-2f40-47a2-85b6-e46b9d2712f8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.316631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73b7b49a-3f2b-4961-ac42-b426af83bea2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73b7b49a-3f2b-4961-ac42-b426af83bea2" (UID: "73b7b49a-3f2b-4961-ac42-b426af83bea2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.320656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b7b49a-3f2b-4961-ac42-b426af83bea2-kube-api-access-wbw6r" (OuterVolumeSpecName: "kube-api-access-wbw6r") pod "73b7b49a-3f2b-4961-ac42-b426af83bea2" (UID: "73b7b49a-3f2b-4961-ac42-b426af83bea2"). InnerVolumeSpecName "kube-api-access-wbw6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.320725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-kube-api-access-ms6jk" (OuterVolumeSpecName: "kube-api-access-ms6jk") pod "a33a8c27-088e-4cc4-9447-3e0e2d1e3e85" (UID: "a33a8c27-088e-4cc4-9447-3e0e2d1e3e85"). InnerVolumeSpecName "kube-api-access-ms6jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.321295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-kube-api-access-7rpw2" (OuterVolumeSpecName: "kube-api-access-7rpw2") pod "fc4e65cf-2f40-47a2-85b6-e46b9d2712f8" (UID: "fc4e65cf-2f40-47a2-85b6-e46b9d2712f8"). InnerVolumeSpecName "kube-api-access-7rpw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.416463 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbw6r\" (UniqueName: \"kubernetes.io/projected/73b7b49a-3f2b-4961-ac42-b426af83bea2-kube-api-access-wbw6r\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.416497 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rpw2\" (UniqueName: \"kubernetes.io/projected/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-kube-api-access-7rpw2\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.416508 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73b7b49a-3f2b-4961-ac42-b426af83bea2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.416518 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.416527 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms6jk\" (UniqueName: \"kubernetes.io/projected/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85-kube-api-access-ms6jk\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:41 crc kubenswrapper[4707]: I0218 06:05:41.416535 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.525203 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.723126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-config-data\") pod \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.723211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48x5k\" (UniqueName: \"kubernetes.io/projected/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-kube-api-access-48x5k\") pod \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.723265 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-combined-ca-bundle\") pod \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.723293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-db-sync-config-data\") pod \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\" (UID: \"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2\") " Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.727493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-kube-api-access-48x5k" (OuterVolumeSpecName: "kube-api-access-48x5k") pod "e2c15f7d-dfba-43f1-bb09-48d1ce940ed2" (UID: "e2c15f7d-dfba-43f1-bb09-48d1ce940ed2"). InnerVolumeSpecName "kube-api-access-48x5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.727678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e2c15f7d-dfba-43f1-bb09-48d1ce940ed2" (UID: "e2c15f7d-dfba-43f1-bb09-48d1ce940ed2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.748227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2c15f7d-dfba-43f1-bb09-48d1ce940ed2" (UID: "e2c15f7d-dfba-43f1-bb09-48d1ce940ed2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.766003 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-config-data" (OuterVolumeSpecName: "config-data") pod "e2c15f7d-dfba-43f1-bb09-48d1ce940ed2" (UID: "e2c15f7d-dfba-43f1-bb09-48d1ce940ed2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.825131 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.825166 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48x5k\" (UniqueName: \"kubernetes.io/projected/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-kube-api-access-48x5k\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.825179 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:41.825190 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.200057 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-czbp6" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.200062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-czbp6" event={"ID":"e2c15f7d-dfba-43f1-bb09-48d1ce940ed2","Type":"ContainerDied","Data":"2b2c83d75958d68eae0291255bb1e31864f4bc121df2c39f5af5cdc04e898fa5"} Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.200184 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b2c83d75958d68eae0291255bb1e31864f4bc121df2c39f5af5cdc04e898fa5" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.208410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dhgkp" event={"ID":"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4","Type":"ContainerStarted","Data":"ac11c4a17e423f0e89027de29bb407c5966f3712ef77abad08e296ce9691f732"} Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.208488 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-6w79n" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.209462 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hcll4" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.240821 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dhgkp" podStartSLOduration=2.700322925 podStartE2EDuration="7.240804481s" podCreationTimestamp="2026-02-18 06:05:35 +0000 UTC" firstStartedPulling="2026-02-18 06:05:36.269368066 +0000 UTC m=+1072.917327200" lastFinishedPulling="2026-02-18 06:05:40.809849622 +0000 UTC m=+1077.457808756" observedRunningTime="2026-02-18 06:05:42.228414081 +0000 UTC m=+1078.876373215" watchObservedRunningTime="2026-02-18 06:05:42.240804481 +0000 UTC m=+1078.888763615" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.529299 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-9xdbf"] Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.533280 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" podUID="29a831ec-5884-4230-931d-4b62b461ed2e" containerName="dnsmasq-dns" containerID="cri-o://eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133" gracePeriod=10 Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.536031 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.579493 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ttslv"] Feb 18 06:05:42 crc kubenswrapper[4707]: E0218 06:05:42.579909 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b7b49a-3f2b-4961-ac42-b426af83bea2" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.579927 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b7b49a-3f2b-4961-ac42-b426af83bea2" containerName="mariadb-database-create" Feb 18 06:05:42 crc 
kubenswrapper[4707]: E0218 06:05:42.579951 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33a8c27-088e-4cc4-9447-3e0e2d1e3e85" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.579960 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33a8c27-088e-4cc4-9447-3e0e2d1e3e85" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: E0218 06:05:42.579970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4e65cf-2f40-47a2-85b6-e46b9d2712f8" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.579977 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4e65cf-2f40-47a2-85b6-e46b9d2712f8" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: E0218 06:05:42.579989 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.579995 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: E0218 06:05:42.580004 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0ff79e-f841-48f7-9714-ca0c20783edf" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580028 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0ff79e-f841-48f7-9714-ca0c20783edf" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: E0218 06:05:42.580041 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0171309-d2f6-4ff6-bcd3-ac892477355c" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580047 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c0171309-d2f6-4ff6-bcd3-ac892477355c" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: E0218 06:05:42.580056 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c15f7d-dfba-43f1-bb09-48d1ce940ed2" containerName="glance-db-sync" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580064 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c15f7d-dfba-43f1-bb09-48d1ce940ed2" containerName="glance-db-sync" Feb 18 06:05:42 crc kubenswrapper[4707]: E0218 06:05:42.580074 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b014870-bb4b-4241-bf1e-1b579389c879" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580080 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b014870-bb4b-4241-bf1e-1b579389c879" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: E0218 06:05:42.580087 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b8210d-4e51-4dae-b09c-dbf714c1ca7e" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580093 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b8210d-4e51-4dae-b09c-dbf714c1ca7e" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580257 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580270 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0ff79e-f841-48f7-9714-ca0c20783edf" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580278 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="73b7b49a-3f2b-4961-ac42-b426af83bea2" containerName="mariadb-database-create" Feb 18 06:05:42 crc 
kubenswrapper[4707]: I0218 06:05:42.580287 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c15f7d-dfba-43f1-bb09-48d1ce940ed2" containerName="glance-db-sync" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580298 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33a8c27-088e-4cc4-9447-3e0e2d1e3e85" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580308 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b8210d-4e51-4dae-b09c-dbf714c1ca7e" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580317 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4e65cf-2f40-47a2-85b6-e46b9d2712f8" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580324 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b014870-bb4b-4241-bf1e-1b579389c879" containerName="mariadb-account-create-update" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.580333 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0171309-d2f6-4ff6-bcd3-ac892477355c" containerName="mariadb-database-create" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.581172 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.603806 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ttslv"] Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.643007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.643044 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.643175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-779nr\" (UniqueName: \"kubernetes.io/projected/0235fb3c-1926-4b1b-bb30-1fcb473f7744-kube-api-access-779nr\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.643286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.643327 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-config\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.643364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-svc\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.744995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-config\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.745052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-svc\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.745092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.745112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.745142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-779nr\" (UniqueName: \"kubernetes.io/projected/0235fb3c-1926-4b1b-bb30-1fcb473f7744-kube-api-access-779nr\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.745212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.746000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-config\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.746095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.746102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-svc\") pod 
\"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.746178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.746241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.763111 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-779nr\" (UniqueName: \"kubernetes.io/projected/0235fb3c-1926-4b1b-bb30-1fcb473f7744-kube-api-access-779nr\") pod \"dnsmasq-dns-895cf5cf-ttslv\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") " pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:42 crc kubenswrapper[4707]: I0218 06:05:42.899357 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.001901 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.050767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-nb\") pod \"29a831ec-5884-4230-931d-4b62b461ed2e\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.050845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nnvh\" (UniqueName: \"kubernetes.io/projected/29a831ec-5884-4230-931d-4b62b461ed2e-kube-api-access-4nnvh\") pod \"29a831ec-5884-4230-931d-4b62b461ed2e\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.050897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-swift-storage-0\") pod \"29a831ec-5884-4230-931d-4b62b461ed2e\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.050933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-svc\") pod \"29a831ec-5884-4230-931d-4b62b461ed2e\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.050987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-config\") pod \"29a831ec-5884-4230-931d-4b62b461ed2e\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.051014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-sb\") pod \"29a831ec-5884-4230-931d-4b62b461ed2e\" (UID: \"29a831ec-5884-4230-931d-4b62b461ed2e\") " Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.067210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a831ec-5884-4230-931d-4b62b461ed2e-kube-api-access-4nnvh" (OuterVolumeSpecName: "kube-api-access-4nnvh") pod "29a831ec-5884-4230-931d-4b62b461ed2e" (UID: "29a831ec-5884-4230-931d-4b62b461ed2e"). InnerVolumeSpecName "kube-api-access-4nnvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.109429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-config" (OuterVolumeSpecName: "config") pod "29a831ec-5884-4230-931d-4b62b461ed2e" (UID: "29a831ec-5884-4230-931d-4b62b461ed2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.121727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29a831ec-5884-4230-931d-4b62b461ed2e" (UID: "29a831ec-5884-4230-931d-4b62b461ed2e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.139534 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29a831ec-5884-4230-931d-4b62b461ed2e" (UID: "29a831ec-5884-4230-931d-4b62b461ed2e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.154686 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nnvh\" (UniqueName: \"kubernetes.io/projected/29a831ec-5884-4230-931d-4b62b461ed2e-kube-api-access-4nnvh\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.154717 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.154728 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.155393 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.168250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29a831ec-5884-4230-931d-4b62b461ed2e" (UID: "29a831ec-5884-4230-931d-4b62b461ed2e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.176422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "29a831ec-5884-4230-931d-4b62b461ed2e" (UID: "29a831ec-5884-4230-931d-4b62b461ed2e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.222114 4707 generic.go:334] "Generic (PLEG): container finished" podID="29a831ec-5884-4230-931d-4b62b461ed2e" containerID="eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133" exitCode=0 Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.222297 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" event={"ID":"29a831ec-5884-4230-931d-4b62b461ed2e","Type":"ContainerDied","Data":"eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133"} Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.223450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" event={"ID":"29a831ec-5884-4230-931d-4b62b461ed2e","Type":"ContainerDied","Data":"51df5b5d2f38891a4584d5a5bbe2a334eb460c75fda66fd1b2343bafc30f7a06"} Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.223479 4707 scope.go:117] "RemoveContainer" containerID="eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.222395 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-9xdbf" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.249894 4707 scope.go:117] "RemoveContainer" containerID="05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.256597 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.256625 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a831ec-5884-4230-931d-4b62b461ed2e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.268852 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-9xdbf"] Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.272535 4707 scope.go:117] "RemoveContainer" containerID="eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133" Feb 18 06:05:43 crc kubenswrapper[4707]: E0218 06:05:43.273294 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133\": container with ID starting with eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133 not found: ID does not exist" containerID="eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.273329 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133"} err="failed to get container status \"eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133\": rpc error: code = NotFound desc = could not find container 
\"eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133\": container with ID starting with eea1c8019e039bfbf3eab58c109a4156cdc66427b7c5d90c79cd5f8336340133 not found: ID does not exist" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.273349 4707 scope.go:117] "RemoveContainer" containerID="05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858" Feb 18 06:05:43 crc kubenswrapper[4707]: E0218 06:05:43.273664 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858\": container with ID starting with 05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858 not found: ID does not exist" containerID="05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.273830 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858"} err="failed to get container status \"05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858\": rpc error: code = NotFound desc = could not find container \"05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858\": container with ID starting with 05ead2f713069421750838119c253f8c376151b156678a0d3281846c314de858 not found: ID does not exist" Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.277569 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-9xdbf"] Feb 18 06:05:43 crc kubenswrapper[4707]: I0218 06:05:43.354638 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ttslv"] Feb 18 06:05:43 crc kubenswrapper[4707]: W0218 06:05:43.362437 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0235fb3c_1926_4b1b_bb30_1fcb473f7744.slice/crio-942cc0f478e810c93d2ae6841ef504c3cdbf27064c09c05958a9dce9c9573b20 WatchSource:0}: Error finding container 942cc0f478e810c93d2ae6841ef504c3cdbf27064c09c05958a9dce9c9573b20: Status 404 returned error can't find the container with id 942cc0f478e810c93d2ae6841ef504c3cdbf27064c09c05958a9dce9c9573b20 Feb 18 06:05:44 crc kubenswrapper[4707]: I0218 06:05:44.065420 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a831ec-5884-4230-931d-4b62b461ed2e" path="/var/lib/kubelet/pods/29a831ec-5884-4230-931d-4b62b461ed2e/volumes" Feb 18 06:05:44 crc kubenswrapper[4707]: I0218 06:05:44.239669 4707 generic.go:334] "Generic (PLEG): container finished" podID="0235fb3c-1926-4b1b-bb30-1fcb473f7744" containerID="54df48011d93411c1e06102fcafbb5072c599c26449390329373cb368df7fc58" exitCode=0 Feb 18 06:05:44 crc kubenswrapper[4707]: I0218 06:05:44.239741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" event={"ID":"0235fb3c-1926-4b1b-bb30-1fcb473f7744","Type":"ContainerDied","Data":"54df48011d93411c1e06102fcafbb5072c599c26449390329373cb368df7fc58"} Feb 18 06:05:44 crc kubenswrapper[4707]: I0218 06:05:44.239781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" event={"ID":"0235fb3c-1926-4b1b-bb30-1fcb473f7744","Type":"ContainerStarted","Data":"942cc0f478e810c93d2ae6841ef504c3cdbf27064c09c05958a9dce9c9573b20"} Feb 18 06:05:45 crc kubenswrapper[4707]: I0218 06:05:45.248827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" event={"ID":"0235fb3c-1926-4b1b-bb30-1fcb473f7744","Type":"ContainerStarted","Data":"24565e55b736a2340fab33ae6479a4eaadac168a1f2e0d460093984d53cfdc1f"} Feb 18 06:05:45 crc kubenswrapper[4707]: I0218 06:05:45.249277 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:45 crc kubenswrapper[4707]: I0218 06:05:45.250321 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4" containerID="ac11c4a17e423f0e89027de29bb407c5966f3712ef77abad08e296ce9691f732" exitCode=0 Feb 18 06:05:45 crc kubenswrapper[4707]: I0218 06:05:45.250351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dhgkp" event={"ID":"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4","Type":"ContainerDied","Data":"ac11c4a17e423f0e89027de29bb407c5966f3712ef77abad08e296ce9691f732"} Feb 18 06:05:45 crc kubenswrapper[4707]: I0218 06:05:45.276923 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" podStartSLOduration=3.276897679 podStartE2EDuration="3.276897679s" podCreationTimestamp="2026-02-18 06:05:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:45.271866594 +0000 UTC m=+1081.919825728" watchObservedRunningTime="2026-02-18 06:05:45.276897679 +0000 UTC m=+1081.924856813" Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.562096 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.714524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-combined-ca-bundle\") pod \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.714593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdq7r\" (UniqueName: \"kubernetes.io/projected/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-kube-api-access-xdq7r\") pod \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.714672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-config-data\") pod \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\" (UID: \"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4\") " Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.722660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-kube-api-access-xdq7r" (OuterVolumeSpecName: "kube-api-access-xdq7r") pod "0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4" (UID: "0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4"). InnerVolumeSpecName "kube-api-access-xdq7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.740621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4" (UID: "0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.758432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-config-data" (OuterVolumeSpecName: "config-data") pod "0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4" (UID: "0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.834640 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.836092 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdq7r\" (UniqueName: \"kubernetes.io/projected/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-kube-api-access-xdq7r\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:46 crc kubenswrapper[4707]: I0218 06:05:46.836548 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.265998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dhgkp" event={"ID":"0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4","Type":"ContainerDied","Data":"2270c87cb4affa9430ddb58e0a7ff4bd6a748d7b9df4bd92f2921aabcf7c4afa"} Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.266450 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2270c87cb4affa9430ddb58e0a7ff4bd6a748d7b9df4bd92f2921aabcf7c4afa" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.266084 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dhgkp" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.568448 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9phtj"] Feb 18 06:05:47 crc kubenswrapper[4707]: E0218 06:05:47.569060 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4" containerName="keystone-db-sync" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.569078 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4" containerName="keystone-db-sync" Feb 18 06:05:47 crc kubenswrapper[4707]: E0218 06:05:47.569097 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a831ec-5884-4230-931d-4b62b461ed2e" containerName="init" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.569104 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a831ec-5884-4230-931d-4b62b461ed2e" containerName="init" Feb 18 06:05:47 crc kubenswrapper[4707]: E0218 06:05:47.569112 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a831ec-5884-4230-931d-4b62b461ed2e" containerName="dnsmasq-dns" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.569120 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a831ec-5884-4230-931d-4b62b461ed2e" containerName="dnsmasq-dns" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.569389 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a831ec-5884-4230-931d-4b62b461ed2e" containerName="dnsmasq-dns" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.569406 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4" containerName="keystone-db-sync" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.570578 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.573362 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.574990 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.575165 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.575568 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lqdm4" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.582329 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.589462 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ttslv"] Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.589671 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" podUID="0235fb3c-1926-4b1b-bb30-1fcb473f7744" containerName="dnsmasq-dns" containerID="cri-o://24565e55b736a2340fab33ae6479a4eaadac168a1f2e0d460093984d53cfdc1f" gracePeriod=10 Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.602644 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9phtj"] Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.634880 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-lrczk"] Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.636943 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.649898 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-lrczk"] Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.650325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn5g4\" (UniqueName: \"kubernetes.io/projected/396458cb-5b0d-4208-8340-00ed637c67a8-kube-api-access-tn5g4\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.650553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-config\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.650632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-combined-ca-bundle\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.650700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.650771 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.650873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-scripts\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.650956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.651081 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-659c7\" (UniqueName: \"kubernetes.io/projected/5f8b563b-4b84-4a1e-8138-d733174bce8c-kube-api-access-659c7\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.651158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-fernet-keys\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.651233 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.651310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-credential-keys\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.651422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-config-data\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-credential-keys\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757194 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-config-data\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn5g4\" (UniqueName: \"kubernetes.io/projected/396458cb-5b0d-4208-8340-00ed637c67a8-kube-api-access-tn5g4\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-config\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-combined-ca-bundle\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757374 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-scripts\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-659c7\" (UniqueName: \"kubernetes.io/projected/5f8b563b-4b84-4a1e-8138-d733174bce8c-kube-api-access-659c7\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.757502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-fernet-keys\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.758492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-config\") pod 
\"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.759331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.760276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.760548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.761561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.767037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-fernet-keys\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " 
pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.770324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-config-data\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.777011 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-combined-ca-bundle\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.779440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-credential-keys\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.782607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-scripts\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.806528 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-t45dj"] Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.808097 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.815183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-659c7\" (UniqueName: \"kubernetes.io/projected/5f8b563b-4b84-4a1e-8138-d733174bce8c-kube-api-access-659c7\") pod \"keystone-bootstrap-9phtj\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") " pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.823291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn5g4\" (UniqueName: \"kubernetes.io/projected/396458cb-5b0d-4208-8340-00ed637c67a8-kube-api-access-tn5g4\") pod \"dnsmasq-dns-6c9c9f998c-lrczk\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.825404 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.825620 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ncszv" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.825733 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.857493 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-t45dj"] Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.859359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-combined-ca-bundle\") pod \"neutron-db-sync-t45dj\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.859411 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-config\") pod \"neutron-db-sync-t45dj\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.859477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9gcf\" (UniqueName: \"kubernetes.io/projected/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-kube-api-access-h9gcf\") pod \"neutron-db-sync-t45dj\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.902152 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-648fb46955-dwn6p"] Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.903550 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.904187 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.926216 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-648fb46955-dwn6p"] Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.926363 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.926533 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-85ld6" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.926590 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.926787 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.965989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-combined-ca-bundle\") pod \"neutron-db-sync-t45dj\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.966036 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44pb\" (UniqueName: \"kubernetes.io/projected/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-kube-api-access-w44pb\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.966065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-config\") pod \"neutron-db-sync-t45dj\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " 
pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.966125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9gcf\" (UniqueName: \"kubernetes.io/projected/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-kube-api-access-h9gcf\") pod \"neutron-db-sync-t45dj\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.966145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-scripts\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.966182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-config-data\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.966213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-logs\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.966242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-horizon-secret-key\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 
06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.972585 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.972839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-combined-ca-bundle\") pod \"neutron-db-sync-t45dj\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:47 crc kubenswrapper[4707]: I0218 06:05:47.972839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-config\") pod \"neutron-db-sync-t45dj\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.003859 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-w2p4x"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.004606 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9gcf\" (UniqueName: \"kubernetes.io/projected/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-kube-api-access-h9gcf\") pod \"neutron-db-sync-t45dj\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.004897 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.012227 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.012474 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-mtxr9" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.067207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w44pb\" (UniqueName: \"kubernetes.io/projected/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-kube-api-access-w44pb\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.067281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-job-config-data\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.067308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-combined-ca-bundle\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.067342 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-scripts\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 
06:05:48.067379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-config-data\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.067396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-logs\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.067430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-config-data\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.067453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-horizon-secret-key\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.067474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrw4\" (UniqueName: \"kubernetes.io/projected/0301370e-0d52-4549-93ba-033d6d706508-kube-api-access-lbrw4\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.074571 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-db-sync-4kl7b"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.075596 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.088362 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.088626 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.088756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-config-data\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.089162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-scripts\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.089421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-logs\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.096936 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nqck2" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.101222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-horizon-secret-key\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.121392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w44pb\" (UniqueName: \"kubernetes.io/projected/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-kube-api-access-w44pb\") pod \"horizon-648fb46955-dwn6p\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.156279 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-w2p4x"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.169876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-combined-ca-bundle\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.169944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-scripts\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.169978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-config-data\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.169999 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-db-sync-config-data\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.170034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-config-data\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.170051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4267g\" (UniqueName: \"kubernetes.io/projected/e5f975a7-ca3d-4940-9281-051360f67955-kube-api-access-4267g\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.170076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrw4\" (UniqueName: \"kubernetes.io/projected/0301370e-0d52-4549-93ba-033d6d706508-kube-api-access-lbrw4\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.170152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f975a7-ca3d-4940-9281-051360f67955-etc-machine-id\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.170172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-combined-ca-bundle\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.170189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-job-config-data\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.171297 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4kl7b"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.181427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-combined-ca-bundle\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.185266 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.187101 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.190191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-job-config-data\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.193716 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.193846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-config-data\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.194194 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.207100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrw4\" (UniqueName: \"kubernetes.io/projected/0301370e-0d52-4549-93ba-033d6d706508-kube-api-access-lbrw4\") pod \"manila-db-sync-w2p4x\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.212187 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-lrczk"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.227309 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-t45dj" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.232422 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.254991 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-bpqnj"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.292847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4267g\" (UniqueName: \"kubernetes.io/projected/e5f975a7-ca3d-4940-9281-051360f67955-kube-api-access-4267g\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.293115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f975a7-ca3d-4940-9281-051360f67955-etc-machine-id\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.293162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-combined-ca-bundle\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.293301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-scripts\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.293378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-config-data\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.293412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-db-sync-config-data\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.294692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f975a7-ca3d-4940-9281-051360f67955-etc-machine-id\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.296878 4707 generic.go:334] "Generic (PLEG): container finished" podID="0235fb3c-1926-4b1b-bb30-1fcb473f7744" containerID="24565e55b736a2340fab33ae6479a4eaadac168a1f2e0d460093984d53cfdc1f" exitCode=0 Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.310644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" event={"ID":"0235fb3c-1926-4b1b-bb30-1fcb473f7744","Type":"ContainerDied","Data":"24565e55b736a2340fab33ae6479a4eaadac168a1f2e0d460093984d53cfdc1f"} Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.310703 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bpqnj"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.310828 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bpqnj" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.311420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-combined-ca-bundle\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.313354 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-db-sync-config-data\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.333778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-config-data\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.334331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-scripts\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.335915 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-sqgvm" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.336726 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.342028 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.343118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4267g\" (UniqueName: \"kubernetes.io/projected/e5f975a7-ca3d-4940-9281-051360f67955-kube-api-access-4267g\") pod \"cinder-db-sync-4kl7b\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.359634 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hdgrh"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.363405 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.363626 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-w2p4x" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.391110 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4gf8v"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.392681 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.395567 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6lwn9" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.396668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.396728 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-run-httpd\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.396756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-scripts\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.396781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.396830 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-config-data\") 
pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.396853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-log-httpd\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.396878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ggsw\" (UniqueName: \"kubernetes.io/projected/835f51c5-3996-4120-bd2f-4b3bef33c31d-kube-api-access-7ggsw\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.400869 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.401113 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.417775 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.444861 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hdgrh"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.479961 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4gf8v"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrcsl\" (UniqueName: \"kubernetes.io/projected/27b6874e-b7bf-4d36-8ea1-66cf492ba327-kube-api-access-zrcsl\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-config-data\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-db-sync-config-data\") pod \"barbican-db-sync-bpqnj\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " pod="openstack/barbican-db-sync-bpqnj" Feb 18 06:05:48 crc 
kubenswrapper[4707]: I0218 06:05:48.501815 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-log-httpd\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz2kr\" (UniqueName: \"kubernetes.io/projected/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-kube-api-access-zz2kr\") pod \"barbican-db-sync-bpqnj\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " pod="openstack/barbican-db-sync-bpqnj" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ggsw\" (UniqueName: \"kubernetes.io/projected/835f51c5-3996-4120-bd2f-4b3bef33c31d-kube-api-access-7ggsw\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-scripts\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgvl9\" (UniqueName: \"kubernetes.io/projected/74700413-a033-43e0-b01c-04c21e097135-kube-api-access-hgvl9\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501963 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b6874e-b7bf-4d36-8ea1-66cf492ba327-logs\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.501996 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.502016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-config\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.502035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-run-httpd\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.502050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-combined-ca-bundle\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.502074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-scripts\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.502091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-config-data\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.502107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.502129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.502145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-combined-ca-bundle\") 
pod \"barbican-db-sync-bpqnj\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " pod="openstack/barbican-db-sync-bpqnj" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.502162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.507119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-log-httpd\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.510495 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.511896 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.512429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-scripts\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.512679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-run-httpd\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.515079 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.515283 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.515414 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.519681 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9r2t2" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.519903 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.543938 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 
06:05:48.547214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.564077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-config-data\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.584900 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.595266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ggsw\" (UniqueName: \"kubernetes.io/projected/835f51c5-3996-4120-bd2f-4b3bef33c31d-kube-api-access-7ggsw\") pod \"ceilometer-0\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " pod="openstack/ceilometer-0" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.609653 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8c594f6b9-4cz2s"] Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.611929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b6874e-b7bf-4d36-8ea1-66cf492ba327-logs\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.611990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: 
\"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612021 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-config\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-combined-ca-bundle\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-config-data\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-combined-ca-bundle\") pod \"barbican-db-sync-bpqnj\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " pod="openstack/barbican-db-sync-bpqnj" Feb 18 
06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrcsl\" (UniqueName: \"kubernetes.io/projected/27b6874e-b7bf-4d36-8ea1-66cf492ba327-kube-api-access-zrcsl\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-db-sync-config-data\") pod \"barbican-db-sync-bpqnj\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " pod="openstack/barbican-db-sync-bpqnj" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz2kr\" (UniqueName: \"kubernetes.io/projected/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-kube-api-access-zz2kr\") pod \"barbican-db-sync-bpqnj\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " pod="openstack/barbican-db-sync-bpqnj" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 
06:05:48.612435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-scripts\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.612458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgvl9\" (UniqueName: \"kubernetes.io/projected/74700413-a033-43e0-b01c-04c21e097135-kube-api-access-hgvl9\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.613157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b6874e-b7bf-4d36-8ea1-66cf492ba327-logs\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.613902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.614426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-config\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.616167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.617248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.623098 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.623931 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-ttslv"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.624715 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.628241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-config-data\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.633305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-db-sync-config-data\") pod \"barbican-db-sync-bpqnj\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " pod="openstack/barbican-db-sync-bpqnj"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.650240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-combined-ca-bundle\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.651028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-scripts\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.659870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-combined-ca-bundle\") pod \"barbican-db-sync-bpqnj\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " pod="openstack/barbican-db-sync-bpqnj"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.663091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrcsl\" (UniqueName: \"kubernetes.io/projected/27b6874e-b7bf-4d36-8ea1-66cf492ba327-kube-api-access-zrcsl\") pod \"placement-db-sync-4gf8v\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " pod="openstack/placement-db-sync-4gf8v"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.671840 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgvl9\" (UniqueName: \"kubernetes.io/projected/74700413-a033-43e0-b01c-04c21e097135-kube-api-access-hgvl9\") pod \"dnsmasq-dns-57c957c4ff-hdgrh\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.675300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz2kr\" (UniqueName: \"kubernetes.io/projected/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-kube-api-access-zz2kr\") pod \"barbican-db-sync-bpqnj\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " pod="openstack/barbican-db-sync-bpqnj"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.683363 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c594f6b9-4cz2s"]
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.713863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-config\") pod \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") "
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.725990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-nb\") pod \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") "
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.726500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-swift-storage-0\") pod \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") "
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.726537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-svc\") pod \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") "
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.726588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-779nr\" (UniqueName: \"kubernetes.io/projected/0235fb3c-1926-4b1b-bb30-1fcb473f7744-kube-api-access-779nr\") pod \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") "
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.726627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-sb\") pod \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\" (UID: \"0235fb3c-1926-4b1b-bb30-1fcb473f7744\") "
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.726903 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-ceph\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.726931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-scripts\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.726946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0625e994-776d-4baf-a5b8-557b0ec5e11e-logs\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-config-data\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727150 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g9tw\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-kube-api-access-9g9tw\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-scripts\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0625e994-776d-4baf-a5b8-557b0ec5e11e-horizon-secret-key\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czwfv\" (UniqueName: \"kubernetes.io/projected/0625e994-776d-4baf-a5b8-557b0ec5e11e-kube-api-access-czwfv\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-logs\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.727377 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-config-data\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.740833 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bpqnj"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.741341 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0235fb3c-1926-4b1b-bb30-1fcb473f7744-kube-api-access-779nr" (OuterVolumeSpecName: "kube-api-access-779nr") pod "0235fb3c-1926-4b1b-bb30-1fcb473f7744" (UID: "0235fb3c-1926-4b1b-bb30-1fcb473f7744"). InnerVolumeSpecName "kube-api-access-779nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.787605 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.832506 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.833644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0625e994-776d-4baf-a5b8-557b0ec5e11e-horizon-secret-key\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.833678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czwfv\" (UniqueName: \"kubernetes.io/projected/0625e994-776d-4baf-a5b8-557b0ec5e11e-kube-api-access-czwfv\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.833702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.833718 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-logs\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.834975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-config-data\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-ceph\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-scripts\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0625e994-776d-4baf-a5b8-557b0ec5e11e-logs\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-config-data\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g9tw\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-kube-api-access-9g9tw\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835247 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-scripts\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.835816 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-779nr\" (UniqueName: \"kubernetes.io/projected/0235fb3c-1926-4b1b-bb30-1fcb473f7744-kube-api-access-779nr\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.837633 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.843522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-logs\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.847365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.850313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0625e994-776d-4baf-a5b8-557b0ec5e11e-logs\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.854150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-scripts\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.855104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-scripts\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.855531 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4gf8v"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.856565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.874981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-ceph\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.876724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.879296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czwfv\" (UniqueName: \"kubernetes.io/projected/0625e994-776d-4baf-a5b8-557b0ec5e11e-kube-api-access-czwfv\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.880316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0625e994-776d-4baf-a5b8-557b0ec5e11e-horizon-secret-key\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.880912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-config-data\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.882504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-config-data\") pod \"horizon-8c594f6b9-4cz2s\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.904271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g9tw\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-kube-api-access-9g9tw\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.926280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0235fb3c-1926-4b1b-bb30-1fcb473f7744" (UID: "0235fb3c-1926-4b1b-bb30-1fcb473f7744"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.939976 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.950642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0235fb3c-1926-4b1b-bb30-1fcb473f7744" (UID: "0235fb3c-1926-4b1b-bb30-1fcb473f7744"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.951011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-config" (OuterVolumeSpecName: "config") pod "0235fb3c-1926-4b1b-bb30-1fcb473f7744" (UID: "0235fb3c-1926-4b1b-bb30-1fcb473f7744"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.958098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.965369 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-lrczk"]
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.975834 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9phtj"]
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.977076 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.987360 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:05:48 crc kubenswrapper[4707]: E0218 06:05:48.987981 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0235fb3c-1926-4b1b-bb30-1fcb473f7744" containerName="init"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.987997 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0235fb3c-1926-4b1b-bb30-1fcb473f7744" containerName="init"
Feb 18 06:05:48 crc kubenswrapper[4707]: E0218 06:05:48.988018 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0235fb3c-1926-4b1b-bb30-1fcb473f7744" containerName="dnsmasq-dns"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.988025 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0235fb3c-1926-4b1b-bb30-1fcb473f7744" containerName="dnsmasq-dns"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.988096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0235fb3c-1926-4b1b-bb30-1fcb473f7744" (UID: "0235fb3c-1926-4b1b-bb30-1fcb473f7744"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.988297 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0235fb3c-1926-4b1b-bb30-1fcb473f7744" containerName="dnsmasq-dns"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.989498 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.994386 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.994477 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.996692 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:05:48 crc kubenswrapper[4707]: I0218 06:05:48.998514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0235fb3c-1926-4b1b-bb30-1fcb473f7744" (UID: "0235fb3c-1926-4b1b-bb30-1fcb473f7744"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.044754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.044845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.044915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.044939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.045054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.045103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.045132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fjc7\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-kube-api-access-2fjc7\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.045189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.045245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.045367 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.045380 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.045418 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.045427 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0235fb3c-1926-4b1b-bb30-1fcb473f7744-config\") on node \"crc\" DevicePath \"\""
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.048874 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c594f6b9-4cz2s"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.142142 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-t45dj"]
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.146526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.146578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.146609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.146628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.146668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.146696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.146726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjc7\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-kube-api-access-2fjc7\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.146759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.146782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.148896 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.149678 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.151152 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.154674 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.154700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.158327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0"
Feb
18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.159987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.165176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fjc7\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-kube-api-access-2fjc7\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.168717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.256712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.323739 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-648fb46955-dwn6p"] Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.357590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" event={"ID":"0235fb3c-1926-4b1b-bb30-1fcb473f7744","Type":"ContainerDied","Data":"942cc0f478e810c93d2ae6841ef504c3cdbf27064c09c05958a9dce9c9573b20"} Feb 18 06:05:49 crc 
kubenswrapper[4707]: I0218 06:05:49.357637 4707 scope.go:117] "RemoveContainer" containerID="24565e55b736a2340fab33ae6479a4eaadac168a1f2e0d460093984d53cfdc1f" Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.357633 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-ttslv" Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.370701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" event={"ID":"396458cb-5b0d-4208-8340-00ed637c67a8","Type":"ContainerStarted","Data":"27702f8f3508e44f5c1079f1cd78cdc56ddcfb86e60744e42023484b7d72a430"} Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.380757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t45dj" event={"ID":"668b58a1-6a64-4356-a00f-ccf6faa1ce3b","Type":"ContainerStarted","Data":"30b31f6d120cebfd4d3ffd0a9b5744c97f4501c8829918091e42a179ae66aa6f"} Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.383650 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.390352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9phtj" event={"ID":"5f8b563b-4b84-4a1e-8138-d733174bce8c","Type":"ContainerStarted","Data":"bac7ce3a159130b516bc2854b246ac081f8f40c89838ae489279310fbe5ab978"} Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.412428 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9phtj" podStartSLOduration=2.412405102 podStartE2EDuration="2.412405102s" podCreationTimestamp="2026-02-18 06:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:49.411345955 +0000 UTC m=+1086.059305089" watchObservedRunningTime="2026-02-18 06:05:49.412405102 +0000 UTC m=+1086.060364246" Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.455204 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4kl7b"] Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.485441 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ttslv"] Feb 18 06:05:49 crc kubenswrapper[4707]: W0218 06:05:49.485813 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5f975a7_ca3d_4940_9281_051360f67955.slice/crio-ea3c5044cbd6eac9115a2e2869ae64f1f5a389c412af93229e95fe17dddbe77f WatchSource:0}: Error finding container ea3c5044cbd6eac9115a2e2869ae64f1f5a389c412af93229e95fe17dddbe77f: Status 404 returned error can't find the container with id ea3c5044cbd6eac9115a2e2869ae64f1f5a389c412af93229e95fe17dddbe77f Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.493898 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-ttslv"] Feb 18 06:05:49 crc 
kubenswrapper[4707]: I0218 06:05:49.502532 4707 scope.go:117] "RemoveContainer" containerID="54df48011d93411c1e06102fcafbb5072c599c26449390329373cb368df7fc58" Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.911921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4gf8v"] Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.943876 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:05:49 crc kubenswrapper[4707]: I0218 06:05:49.955589 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bpqnj"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.015366 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c594f6b9-4cz2s"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.030581 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hdgrh"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.096701 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0235fb3c-1926-4b1b-bb30-1fcb473f7744" path="/var/lib/kubelet/pods/0235fb3c-1926-4b1b-bb30-1fcb473f7744/volumes" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.097469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-w2p4x"] Feb 18 06:05:50 crc kubenswrapper[4707]: W0218 06:05:50.245018 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cc3d982_7089_42d5_84d6_d043a6fddbfe.slice/crio-664398e684beea5b317daa1d6070644676d48c3429823a58c9799dcf39c0b662 WatchSource:0}: Error finding container 664398e684beea5b317daa1d6070644676d48c3429823a58c9799dcf39c0b662: Status 404 returned error can't find the container with id 664398e684beea5b317daa1d6070644676d48c3429823a58c9799dcf39c0b662 Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.245918 4707 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.355899 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.405053 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.412046 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-648fb46955-dwn6p"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.436996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bpqnj" event={"ID":"3f8ab203-7caa-4df6-9c0c-599a9d1b9612","Type":"ContainerStarted","Data":"156ed10aff4de870ab0172c570e6f3c7a7c4520bb38a1221165bfe2af07d1e43"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.446686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-w2p4x" event={"ID":"0301370e-0d52-4549-93ba-033d6d706508","Type":"ContainerStarted","Data":"74847e87daa752b0d5e875a6a5a25511a8e43aef7fad69ac2f64536344e00109"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.449189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cc3d982-7089-42d5-84d6-d043a6fddbfe","Type":"ContainerStarted","Data":"664398e684beea5b317daa1d6070644676d48c3429823a58c9799dcf39c0b662"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.451433 4707 generic.go:334] "Generic (PLEG): container finished" podID="396458cb-5b0d-4208-8340-00ed637c67a8" containerID="a4376adb83560793824b20043bdb85879e0b522d85230c5606caa630ef771651" exitCode=0 Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.451584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" 
event={"ID":"396458cb-5b0d-4208-8340-00ed637c67a8","Type":"ContainerDied","Data":"a4376adb83560793824b20043bdb85879e0b522d85230c5606caa630ef771651"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.472606 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d8cc4f5b5-jbjl5"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.473994 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.505196 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" event={"ID":"74700413-a033-43e0-b01c-04c21e097135","Type":"ContainerStarted","Data":"dac8966104dc9881302c8a9a0af2681def4fd165b5a152838663a309a155b071"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.509915 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.546033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c594f6b9-4cz2s" event={"ID":"0625e994-776d-4baf-a5b8-557b0ec5e11e","Type":"ContainerStarted","Data":"18e90572167acff79624676fb616b331e27cc7f0a0dfc135a01f5615c7aff2b0"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.569641 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d8cc4f5b5-jbjl5"] Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.598639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-config-data\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.598753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-82lmz\" (UniqueName: \"kubernetes.io/projected/f21bf944-9893-43da-99fc-5af82d67a34b-kube-api-access-82lmz\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.598785 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-scripts\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.598845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f21bf944-9893-43da-99fc-5af82d67a34b-horizon-secret-key\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.598867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f21bf944-9893-43da-99fc-5af82d67a34b-logs\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.601851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4kl7b" event={"ID":"e5f975a7-ca3d-4940-9281-051360f67955","Type":"ContainerStarted","Data":"ea3c5044cbd6eac9115a2e2869ae64f1f5a389c412af93229e95fe17dddbe77f"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.609046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"835f51c5-3996-4120-bd2f-4b3bef33c31d","Type":"ContainerStarted","Data":"2b9450f88020908d3d06a4f4c8f64efc597db4aa9b69a0fc71b21bcbc35f64c4"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.613733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4gf8v" event={"ID":"27b6874e-b7bf-4d36-8ea1-66cf492ba327","Type":"ContainerStarted","Data":"ae9c35ad76f7ed6fa22fa3ae8fef683c38e5901d174e50070616b6c380019cbe"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.637131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t45dj" event={"ID":"668b58a1-6a64-4356-a00f-ccf6faa1ce3b","Type":"ContainerStarted","Data":"3a93156ee1b4f3e91620c4bde12236aa1e4ef2c92c4112e24d4e0a8a2e806da6"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.645358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-648fb46955-dwn6p" event={"ID":"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e","Type":"ContainerStarted","Data":"14107fc38e75d6471edc68b9ba4d96842af92920d1947711f938237327b75f47"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.657914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9phtj" event={"ID":"5f8b563b-4b84-4a1e-8138-d733174bce8c","Type":"ContainerStarted","Data":"3a273d301b95ba0c04b5d4dfd0516aead8108ca6155655dfd482f7d65bf1f1c6"} Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.700613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-config-data\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.700743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82lmz\" (UniqueName: 
\"kubernetes.io/projected/f21bf944-9893-43da-99fc-5af82d67a34b-kube-api-access-82lmz\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.700776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-scripts\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.700843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f21bf944-9893-43da-99fc-5af82d67a34b-horizon-secret-key\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.700864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f21bf944-9893-43da-99fc-5af82d67a34b-logs\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.701527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f21bf944-9893-43da-99fc-5af82d67a34b-logs\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.702429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-scripts\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " 
pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.703161 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-config-data\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.719510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82lmz\" (UniqueName: \"kubernetes.io/projected/f21bf944-9893-43da-99fc-5af82d67a34b-kube-api-access-82lmz\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.724968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f21bf944-9893-43da-99fc-5af82d67a34b-horizon-secret-key\") pod \"horizon-5d8cc4f5b5-jbjl5\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.854832 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.962060 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-t45dj" podStartSLOduration=3.96204104 podStartE2EDuration="3.96204104s" podCreationTimestamp="2026-02-18 06:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:50.668022279 +0000 UTC m=+1087.315981423" watchObservedRunningTime="2026-02-18 06:05:50.96204104 +0000 UTC m=+1087.610000174" Feb 18 06:05:50 crc kubenswrapper[4707]: I0218 06:05:50.967447 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.111022 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.214169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-nb\") pod \"396458cb-5b0d-4208-8340-00ed637c67a8\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.214203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-sb\") pod \"396458cb-5b0d-4208-8340-00ed637c67a8\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.215276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-svc\") pod \"396458cb-5b0d-4208-8340-00ed637c67a8\" (UID: 
\"396458cb-5b0d-4208-8340-00ed637c67a8\") " Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.215524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-config\") pod \"396458cb-5b0d-4208-8340-00ed637c67a8\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.215560 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn5g4\" (UniqueName: \"kubernetes.io/projected/396458cb-5b0d-4208-8340-00ed637c67a8-kube-api-access-tn5g4\") pod \"396458cb-5b0d-4208-8340-00ed637c67a8\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.215587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-swift-storage-0\") pod \"396458cb-5b0d-4208-8340-00ed637c67a8\" (UID: \"396458cb-5b0d-4208-8340-00ed637c67a8\") " Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.223147 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396458cb-5b0d-4208-8340-00ed637c67a8-kube-api-access-tn5g4" (OuterVolumeSpecName: "kube-api-access-tn5g4") pod "396458cb-5b0d-4208-8340-00ed637c67a8" (UID: "396458cb-5b0d-4208-8340-00ed637c67a8"). InnerVolumeSpecName "kube-api-access-tn5g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.242077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "396458cb-5b0d-4208-8340-00ed637c67a8" (UID: "396458cb-5b0d-4208-8340-00ed637c67a8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.246015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-config" (OuterVolumeSpecName: "config") pod "396458cb-5b0d-4208-8340-00ed637c67a8" (UID: "396458cb-5b0d-4208-8340-00ed637c67a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.261300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "396458cb-5b0d-4208-8340-00ed637c67a8" (UID: "396458cb-5b0d-4208-8340-00ed637c67a8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.261661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "396458cb-5b0d-4208-8340-00ed637c67a8" (UID: "396458cb-5b0d-4208-8340-00ed637c67a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.269702 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "396458cb-5b0d-4208-8340-00ed637c67a8" (UID: "396458cb-5b0d-4208-8340-00ed637c67a8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.317654 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.317685 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.317695 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn5g4\" (UniqueName: \"kubernetes.io/projected/396458cb-5b0d-4208-8340-00ed637c67a8-kube-api-access-tn5g4\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.317714 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.317723 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.317732 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/396458cb-5b0d-4208-8340-00ed637c67a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.382607 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.382654 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.459027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d8cc4f5b5-jbjl5"]
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.684277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cc3d982-7089-42d5-84d6-d043a6fddbfe","Type":"ContainerStarted","Data":"cc9fb961bf97a03dc9897ee5392ff906f7ab810a5a5158fb6791e7be27165dd1"}
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.694480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk" event={"ID":"396458cb-5b0d-4208-8340-00ed637c67a8","Type":"ContainerDied","Data":"27702f8f3508e44f5c1079f1cd78cdc56ddcfb86e60744e42023484b7d72a430"}
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.694516 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-lrczk"
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.694545 4707 scope.go:117] "RemoveContainer" containerID="a4376adb83560793824b20043bdb85879e0b522d85230c5606caa630ef771651"
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.703881 4707 generic.go:334] "Generic (PLEG): container finished" podID="74700413-a033-43e0-b01c-04c21e097135" containerID="8001a3fee03d5ce3fdbefa5a329bdef9cc9e4d29cebf335e7307a31daeb7f800" exitCode=0
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.703962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" event={"ID":"74700413-a033-43e0-b01c-04c21e097135","Type":"ContainerDied","Data":"8001a3fee03d5ce3fdbefa5a329bdef9cc9e4d29cebf335e7307a31daeb7f800"}
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.707554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d8cc4f5b5-jbjl5" event={"ID":"f21bf944-9893-43da-99fc-5af82d67a34b","Type":"ContainerStarted","Data":"e8d9f397f6ecbaad4e93ec91adc4ae059cbd3f16b1d1ee7a1c38560fa5c4f7d9"}
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.713736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d33e59a5-30a9-421e-bff0-81b5fa831347","Type":"ContainerStarted","Data":"74267fe7baa2484b92d134d97da6deeede50cacefb86a30ded2061ab8db4036f"}
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.781768 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-lrczk"]
Feb 18 06:05:51 crc kubenswrapper[4707]: I0218 06:05:51.794443 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-lrczk"]
Feb 18 06:05:52 crc kubenswrapper[4707]: I0218 06:05:52.076293 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396458cb-5b0d-4208-8340-00ed637c67a8" path="/var/lib/kubelet/pods/396458cb-5b0d-4208-8340-00ed637c67a8/volumes"
Feb 18 06:05:52 crc kubenswrapper[4707]: I0218 06:05:52.757476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d33e59a5-30a9-421e-bff0-81b5fa831347","Type":"ContainerStarted","Data":"9a927c7e6a1dd384f92770bbefa0c1a08da8a90b861fac8757fdf67122c93d2a"}
Feb 18 06:05:52 crc kubenswrapper[4707]: I0218 06:05:52.767556 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" event={"ID":"74700413-a033-43e0-b01c-04c21e097135","Type":"ContainerStarted","Data":"da811365dd7843c486e68f03e585fd831a31bf044b9039fb00616cd3ce326c08"}
Feb 18 06:05:52 crc kubenswrapper[4707]: I0218 06:05:52.768759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh"
Feb 18 06:05:52 crc kubenswrapper[4707]: I0218 06:05:52.794584 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" podStartSLOduration=4.794565421 podStartE2EDuration="4.794565421s" podCreationTimestamp="2026-02-18 06:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:52.788881109 +0000 UTC m=+1089.436840243" watchObservedRunningTime="2026-02-18 06:05:52.794565421 +0000 UTC m=+1089.442524555"
Feb 18 06:05:53 crc kubenswrapper[4707]: I0218 06:05:53.825361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d33e59a5-30a9-421e-bff0-81b5fa831347","Type":"ContainerStarted","Data":"21a1e13e92cf035a710054e2d9234fe71fc6780b376905ee70fa92424ddcec77"}
Feb 18 06:05:53 crc kubenswrapper[4707]: I0218 06:05:53.825476 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerName="glance-log" containerID="cri-o://9a927c7e6a1dd384f92770bbefa0c1a08da8a90b861fac8757fdf67122c93d2a" gracePeriod=30
Feb 18 06:05:53 crc kubenswrapper[4707]: I0218 06:05:53.825902 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerName="glance-httpd" containerID="cri-o://21a1e13e92cf035a710054e2d9234fe71fc6780b376905ee70fa92424ddcec77" gracePeriod=30
Feb 18 06:05:53 crc kubenswrapper[4707]: I0218 06:05:53.839913 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cc3d982-7089-42d5-84d6-d043a6fddbfe","Type":"ContainerStarted","Data":"bc680cffb8b9228f3585dc8539a3c1fc64b473ab07f48a5cedeeb4beabf17e79"}
Feb 18 06:05:53 crc kubenswrapper[4707]: I0218 06:05:53.841001 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerName="glance-log" containerID="cri-o://cc9fb961bf97a03dc9897ee5392ff906f7ab810a5a5158fb6791e7be27165dd1" gracePeriod=30
Feb 18 06:05:53 crc kubenswrapper[4707]: I0218 06:05:53.841257 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerName="glance-httpd" containerID="cri-o://bc680cffb8b9228f3585dc8539a3c1fc64b473ab07f48a5cedeeb4beabf17e79" gracePeriod=30
Feb 18 06:05:53 crc kubenswrapper[4707]: I0218 06:05:53.877844 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.877825226 podStartE2EDuration="5.877825226s" podCreationTimestamp="2026-02-18 06:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:53.876336126 +0000 UTC m=+1090.524295260" watchObservedRunningTime="2026-02-18 06:05:53.877825226 +0000 UTC m=+1090.525784360"
Feb 18 06:05:53 crc kubenswrapper[4707]: I0218 06:05:53.914180 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.914149366 podStartE2EDuration="6.914149366s" podCreationTimestamp="2026-02-18 06:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:05:53.899907624 +0000 UTC m=+1090.547866758" watchObservedRunningTime="2026-02-18 06:05:53.914149366 +0000 UTC m=+1090.562108500"
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.865366 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerID="bc680cffb8b9228f3585dc8539a3c1fc64b473ab07f48a5cedeeb4beabf17e79" exitCode=0
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.866031 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerID="cc9fb961bf97a03dc9897ee5392ff906f7ab810a5a5158fb6791e7be27165dd1" exitCode=143
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.865443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cc3d982-7089-42d5-84d6-d043a6fddbfe","Type":"ContainerDied","Data":"bc680cffb8b9228f3585dc8539a3c1fc64b473ab07f48a5cedeeb4beabf17e79"}
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.866202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cc3d982-7089-42d5-84d6-d043a6fddbfe","Type":"ContainerDied","Data":"cc9fb961bf97a03dc9897ee5392ff906f7ab810a5a5158fb6791e7be27165dd1"}
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.872278 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f8b563b-4b84-4a1e-8138-d733174bce8c" containerID="3a273d301b95ba0c04b5d4dfd0516aead8108ca6155655dfd482f7d65bf1f1c6" exitCode=0
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.872331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9phtj" event={"ID":"5f8b563b-4b84-4a1e-8138-d733174bce8c","Type":"ContainerDied","Data":"3a273d301b95ba0c04b5d4dfd0516aead8108ca6155655dfd482f7d65bf1f1c6"}
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.876851 4707 generic.go:334] "Generic (PLEG): container finished" podID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerID="21a1e13e92cf035a710054e2d9234fe71fc6780b376905ee70fa92424ddcec77" exitCode=0
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.876884 4707 generic.go:334] "Generic (PLEG): container finished" podID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerID="9a927c7e6a1dd384f92770bbefa0c1a08da8a90b861fac8757fdf67122c93d2a" exitCode=143
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.877673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d33e59a5-30a9-421e-bff0-81b5fa831347","Type":"ContainerDied","Data":"21a1e13e92cf035a710054e2d9234fe71fc6780b376905ee70fa92424ddcec77"}
Feb 18 06:05:54 crc kubenswrapper[4707]: I0218 06:05:54.877704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d33e59a5-30a9-421e-bff0-81b5fa831347","Type":"ContainerDied","Data":"9a927c7e6a1dd384f92770bbefa0c1a08da8a90b861fac8757fdf67122c93d2a"}
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.075069 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c594f6b9-4cz2s"]
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.113752 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-584c97fdd8-f4pbz"]
Feb 18 06:05:57 crc kubenswrapper[4707]: E0218 06:05:57.114157 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396458cb-5b0d-4208-8340-00ed637c67a8" containerName="init"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.114169 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="396458cb-5b0d-4208-8340-00ed637c67a8" containerName="init"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.114367 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="396458cb-5b0d-4208-8340-00ed637c67a8" containerName="init"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.115289 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.118468 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.122873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-584c97fdd8-f4pbz"]
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.137943 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-tls-certs\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.138042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-config-data\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.138063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-scripts\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.138080 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-secret-key\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.138107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-combined-ca-bundle\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.138152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe56b299-664f-478b-9f30-8e2a4c457676-logs\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.138205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q8ft\" (UniqueName: \"kubernetes.io/projected/fe56b299-664f-478b-9f30-8e2a4c457676-kube-api-access-9q8ft\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.188974 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d8cc4f5b5-jbjl5"]
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.225085 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77db99878b-h8xzs"]
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.226554 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.240839 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-horizon-secret-key\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.240891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q8ft\" (UniqueName: \"kubernetes.io/projected/fe56b299-664f-478b-9f30-8e2a4c457676-kube-api-access-9q8ft\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.240910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-scripts\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.240943 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-combined-ca-bundle\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.240964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-tls-certs\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.240989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-config-data\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.241022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-horizon-tls-certs\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.241053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-config-data\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.241070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-scripts\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.241094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-secret-key\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.241115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fsnp\" (UniqueName: \"kubernetes.io/projected/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-kube-api-access-8fsnp\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.241132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-logs\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.241152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-combined-ca-bundle\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.241192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe56b299-664f-478b-9f30-8e2a4c457676-logs\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.241553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe56b299-664f-478b-9f30-8e2a4c457676-logs\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.242358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-scripts\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.242636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-config-data\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.243270 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77db99878b-h8xzs"]
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.252417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-combined-ca-bundle\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.254185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-tls-certs\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.254952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-secret-key\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.266396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q8ft\" (UniqueName: \"kubernetes.io/projected/fe56b299-664f-478b-9f30-8e2a4c457676-kube-api-access-9q8ft\") pod \"horizon-584c97fdd8-f4pbz\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") " pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.342025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-horizon-tls-certs\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.342098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fsnp\" (UniqueName: \"kubernetes.io/projected/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-kube-api-access-8fsnp\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.342121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-logs\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.342175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-horizon-secret-key\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.342206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-scripts\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.342241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-combined-ca-bundle\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.342270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-config-data\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.343525 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-scripts\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.343682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-logs\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.344392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-config-data\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.347219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-horizon-tls-certs\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.355177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-horizon-secret-key\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.355339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-combined-ca-bundle\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.361918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fsnp\" (UniqueName: \"kubernetes.io/projected/6aa9efa8-e6b5-4307-89b1-8a67547a35e9-kube-api-access-8fsnp\") pod \"horizon-77db99878b-h8xzs\" (UID: \"6aa9efa8-e6b5-4307-89b1-8a67547a35e9\") " pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.440308 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:05:57 crc kubenswrapper[4707]: I0218 06:05:57.543620 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77db99878b-h8xzs"
Feb 18 06:05:58 crc kubenswrapper[4707]: I0218 06:05:58.790010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh"
Feb 18 06:05:58 crc kubenswrapper[4707]: I0218 06:05:58.859803 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bfwr6"]
Feb 18 06:05:58 crc kubenswrapper[4707]: I0218 06:05:58.860053 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="dnsmasq-dns" containerID="cri-o://80fce60381ac9e24e7f596b7f229e4e673bb85287f11c60e3520be80c063d434" gracePeriod=10
Feb 18 06:05:59 crc kubenswrapper[4707]: I0218 06:05:59.940995 4707 generic.go:334] "Generic (PLEG): container finished" podID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerID="80fce60381ac9e24e7f596b7f229e4e673bb85287f11c60e3520be80c063d434" exitCode=0
Feb 18 06:05:59 crc kubenswrapper[4707]: I0218 06:05:59.941086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" event={"ID":"d14db420-fcd2-4cc6-b14e-0b75560e3207","Type":"ContainerDied","Data":"80fce60381ac9e24e7f596b7f229e4e673bb85287f11c60e3520be80c063d434"}
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.171913 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9phtj"
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.219631 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused"
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.298347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-659c7\" (UniqueName: \"kubernetes.io/projected/5f8b563b-4b84-4a1e-8138-d733174bce8c-kube-api-access-659c7\") pod \"5f8b563b-4b84-4a1e-8138-d733174bce8c\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") "
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.298397 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-combined-ca-bundle\") pod \"5f8b563b-4b84-4a1e-8138-d733174bce8c\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") "
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.298465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-credential-keys\") pod \"5f8b563b-4b84-4a1e-8138-d733174bce8c\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") "
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.298484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-scripts\") pod \"5f8b563b-4b84-4a1e-8138-d733174bce8c\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") "
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.298745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-config-data\") pod \"5f8b563b-4b84-4a1e-8138-d733174bce8c\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") "
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.298787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-fernet-keys\") pod \"5f8b563b-4b84-4a1e-8138-d733174bce8c\" (UID: \"5f8b563b-4b84-4a1e-8138-d733174bce8c\") "
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.307583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8b563b-4b84-4a1e-8138-d733174bce8c-kube-api-access-659c7" (OuterVolumeSpecName: "kube-api-access-659c7") pod "5f8b563b-4b84-4a1e-8138-d733174bce8c" (UID: "5f8b563b-4b84-4a1e-8138-d733174bce8c"). InnerVolumeSpecName "kube-api-access-659c7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.311649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5f8b563b-4b84-4a1e-8138-d733174bce8c" (UID: "5f8b563b-4b84-4a1e-8138-d733174bce8c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.331113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-scripts" (OuterVolumeSpecName: "scripts") pod "5f8b563b-4b84-4a1e-8138-d733174bce8c" (UID: "5f8b563b-4b84-4a1e-8138-d733174bce8c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.333052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5f8b563b-4b84-4a1e-8138-d733174bce8c" (UID: "5f8b563b-4b84-4a1e-8138-d733174bce8c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.335736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-config-data" (OuterVolumeSpecName: "config-data") pod "5f8b563b-4b84-4a1e-8138-d733174bce8c" (UID: "5f8b563b-4b84-4a1e-8138-d733174bce8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.341498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f8b563b-4b84-4a1e-8138-d733174bce8c" (UID: "5f8b563b-4b84-4a1e-8138-d733174bce8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.401404 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.401452 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-659c7\" (UniqueName: \"kubernetes.io/projected/5f8b563b-4b84-4a1e-8138-d733174bce8c-kube-api-access-659c7\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.401466 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.401477 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.401487 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.401494 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8b563b-4b84-4a1e-8138-d733174bce8c-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.960100 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9phtj" event={"ID":"5f8b563b-4b84-4a1e-8138-d733174bce8c","Type":"ContainerDied","Data":"bac7ce3a159130b516bc2854b246ac081f8f40c89838ae489279310fbe5ab978"}
Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 
06:06:00.960147 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bac7ce3a159130b516bc2854b246ac081f8f40c89838ae489279310fbe5ab978" Feb 18 06:06:00 crc kubenswrapper[4707]: I0218 06:06:00.960179 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9phtj" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.282051 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9phtj"] Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.289589 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9phtj"] Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.394311 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5qczm"] Feb 18 06:06:01 crc kubenswrapper[4707]: E0218 06:06:01.394960 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8b563b-4b84-4a1e-8138-d733174bce8c" containerName="keystone-bootstrap" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.394980 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8b563b-4b84-4a1e-8138-d733174bce8c" containerName="keystone-bootstrap" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.395238 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8b563b-4b84-4a1e-8138-d733174bce8c" containerName="keystone-bootstrap" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.396710 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.400749 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.401052 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.402182 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.402316 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lqdm4" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.402452 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.412174 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5qczm"] Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.527348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-scripts\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.527452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-combined-ca-bundle\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.527498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-config-data\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.527567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-credential-keys\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.527629 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98dpg\" (UniqueName: \"kubernetes.io/projected/a2835aa4-2296-4af8-b099-af250380f599-kube-api-access-98dpg\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.527650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-fernet-keys\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.629615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-credential-keys\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.630178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98dpg\" (UniqueName: 
\"kubernetes.io/projected/a2835aa4-2296-4af8-b099-af250380f599-kube-api-access-98dpg\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.630205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-fernet-keys\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.630288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-scripts\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.630310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-combined-ca-bundle\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.630361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-config-data\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.634952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-config-data\") pod \"keystone-bootstrap-5qczm\" (UID: 
\"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.635545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-fernet-keys\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.635597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-credential-keys\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.643224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-scripts\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.651832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-combined-ca-bundle\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 06:06:01.660124 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98dpg\" (UniqueName: \"kubernetes.io/projected/a2835aa4-2296-4af8-b099-af250380f599-kube-api-access-98dpg\") pod \"keystone-bootstrap-5qczm\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:01 crc kubenswrapper[4707]: I0218 
06:06:01.722338 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:02 crc kubenswrapper[4707]: I0218 06:06:02.063315 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8b563b-4b84-4a1e-8138-d733174bce8c" path="/var/lib/kubelet/pods/5f8b563b-4b84-4a1e-8138-d733174bce8c/volumes" Feb 18 06:06:04 crc kubenswrapper[4707]: E0218 06:06:04.981188 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 18 06:06:04 crc kubenswrapper[4707]: E0218 06:06:04.981983 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch64bh648h77h67dh54bhd9h55bh577h556h94h666h59h5bdh5b4h5fch57h576h56h5cch67bh57h5ch547h566h7bh7dh5ddh558h57bh88h6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-czwfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:n
il,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8c594f6b9-4cz2s_openstack(0625e994-776d-4baf-a5b8-557b0ec5e11e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:06:04 crc kubenswrapper[4707]: E0218 06:06:04.985019 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-8c594f6b9-4cz2s" podUID="0625e994-776d-4baf-a5b8-557b0ec5e11e" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.021039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cc3d982-7089-42d5-84d6-d043a6fddbfe","Type":"ContainerDied","Data":"664398e684beea5b317daa1d6070644676d48c3429823a58c9799dcf39c0b662"} Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.021135 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="664398e684beea5b317daa1d6070644676d48c3429823a58c9799dcf39c0b662" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.024015 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d33e59a5-30a9-421e-bff0-81b5fa831347","Type":"ContainerDied","Data":"74267fe7baa2484b92d134d97da6deeede50cacefb86a30ded2061ab8db4036f"} Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.025165 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74267fe7baa2484b92d134d97da6deeede50cacefb86a30ded2061ab8db4036f" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.069791 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.074190 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-ceph\") pod \"d33e59a5-30a9-421e-bff0-81b5fa831347\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100257 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fjc7\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-kube-api-access-2fjc7\") pod \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g9tw\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-kube-api-access-9g9tw\") pod \"d33e59a5-30a9-421e-bff0-81b5fa831347\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100303 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"d33e59a5-30a9-421e-bff0-81b5fa831347\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-public-tls-certs\") pod \"d33e59a5-30a9-421e-bff0-81b5fa831347\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-config-data\") pod \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-combined-ca-bundle\") pod \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-logs\") pod \"d33e59a5-30a9-421e-bff0-81b5fa831347\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-scripts\") pod \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " Feb 18 06:06:05 crc 
kubenswrapper[4707]: I0218 06:06:05.100489 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-ceph\") pod \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-logs\") pod \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100529 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-scripts\") pod \"d33e59a5-30a9-421e-bff0-81b5fa831347\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-config-data\") pod \"d33e59a5-30a9-421e-bff0-81b5fa831347\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100640 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-httpd-run\") pod \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " Feb 18 
06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-internal-tls-certs\") pod \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\" (UID: \"9cc3d982-7089-42d5-84d6-d043a6fddbfe\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-combined-ca-bundle\") pod \"d33e59a5-30a9-421e-bff0-81b5fa831347\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.100713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-httpd-run\") pod \"d33e59a5-30a9-421e-bff0-81b5fa831347\" (UID: \"d33e59a5-30a9-421e-bff0-81b5fa831347\") " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.114181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d33e59a5-30a9-421e-bff0-81b5fa831347" (UID: "d33e59a5-30a9-421e-bff0-81b5fa831347"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.115116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "9cc3d982-7089-42d5-84d6-d043a6fddbfe" (UID: "9cc3d982-7089-42d5-84d6-d043a6fddbfe"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.117257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9cc3d982-7089-42d5-84d6-d043a6fddbfe" (UID: "9cc3d982-7089-42d5-84d6-d043a6fddbfe"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.118621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-logs" (OuterVolumeSpecName: "logs") pod "d33e59a5-30a9-421e-bff0-81b5fa831347" (UID: "d33e59a5-30a9-421e-bff0-81b5fa831347"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.120546 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-logs" (OuterVolumeSpecName: "logs") pod "9cc3d982-7089-42d5-84d6-d043a6fddbfe" (UID: "9cc3d982-7089-42d5-84d6-d043a6fddbfe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.124714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-kube-api-access-9g9tw" (OuterVolumeSpecName: "kube-api-access-9g9tw") pod "d33e59a5-30a9-421e-bff0-81b5fa831347" (UID: "d33e59a5-30a9-421e-bff0-81b5fa831347"). InnerVolumeSpecName "kube-api-access-9g9tw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.128012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "d33e59a5-30a9-421e-bff0-81b5fa831347" (UID: "d33e59a5-30a9-421e-bff0-81b5fa831347"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.128047 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-scripts" (OuterVolumeSpecName: "scripts") pod "d33e59a5-30a9-421e-bff0-81b5fa831347" (UID: "d33e59a5-30a9-421e-bff0-81b5fa831347"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.129275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-kube-api-access-2fjc7" (OuterVolumeSpecName: "kube-api-access-2fjc7") pod "9cc3d982-7089-42d5-84d6-d043a6fddbfe" (UID: "9cc3d982-7089-42d5-84d6-d043a6fddbfe"). InnerVolumeSpecName "kube-api-access-2fjc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.130067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-ceph" (OuterVolumeSpecName: "ceph") pod "d33e59a5-30a9-421e-bff0-81b5fa831347" (UID: "d33e59a5-30a9-421e-bff0-81b5fa831347"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.141942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-scripts" (OuterVolumeSpecName: "scripts") pod "9cc3d982-7089-42d5-84d6-d043a6fddbfe" (UID: "9cc3d982-7089-42d5-84d6-d043a6fddbfe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.154222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-ceph" (OuterVolumeSpecName: "ceph") pod "9cc3d982-7089-42d5-84d6-d043a6fddbfe" (UID: "9cc3d982-7089-42d5-84d6-d043a6fddbfe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.169619 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d33e59a5-30a9-421e-bff0-81b5fa831347" (UID: "d33e59a5-30a9-421e-bff0-81b5fa831347"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.185919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cc3d982-7089-42d5-84d6-d043a6fddbfe" (UID: "9cc3d982-7089-42d5-84d6-d043a6fddbfe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.202870 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.202907 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.202918 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.202927 4707 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-ceph\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.202936 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fjc7\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-kube-api-access-2fjc7\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.202946 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g9tw\" (UniqueName: \"kubernetes.io/projected/d33e59a5-30a9-421e-bff0-81b5fa831347-kube-api-access-9g9tw\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.202975 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.202984 4707 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.202993 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d33e59a5-30a9-421e-bff0-81b5fa831347-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.203002 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.203010 4707 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9cc3d982-7089-42d5-84d6-d043a6fddbfe-ceph\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.203018 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cc3d982-7089-42d5-84d6-d043a6fddbfe-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.203031 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.203040 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.209111 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod 
"9cc3d982-7089-42d5-84d6-d043a6fddbfe" (UID: "9cc3d982-7089-42d5-84d6-d043a6fddbfe"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.211603 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d33e59a5-30a9-421e-bff0-81b5fa831347" (UID: "d33e59a5-30a9-421e-bff0-81b5fa831347"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.214064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-config-data" (OuterVolumeSpecName: "config-data") pod "d33e59a5-30a9-421e-bff0-81b5fa831347" (UID: "d33e59a5-30a9-421e-bff0-81b5fa831347"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.219385 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.229413 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-config-data" (OuterVolumeSpecName: "config-data") pod "9cc3d982-7089-42d5-84d6-d043a6fddbfe" (UID: "9cc3d982-7089-42d5-84d6-d043a6fddbfe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.229730 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.233805 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.307034 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.307072 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.307087 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.307098 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.307112 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d33e59a5-30a9-421e-bff0-81b5fa831347-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:05 crc kubenswrapper[4707]: I0218 06:06:05.307121 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9cc3d982-7089-42d5-84d6-d043a6fddbfe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.033069 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.033158 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.089294 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.119118 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.135472 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.143458 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.150322 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:06 crc kubenswrapper[4707]: E0218 06:06:06.151101 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerName="glance-log" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.151142 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerName="glance-log" Feb 18 06:06:06 crc kubenswrapper[4707]: E0218 06:06:06.151174 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerName="glance-httpd" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.151184 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerName="glance-httpd" Feb 18 06:06:06 crc kubenswrapper[4707]: E0218 06:06:06.151197 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerName="glance-httpd" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.151207 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerName="glance-httpd" Feb 18 06:06:06 crc kubenswrapper[4707]: E0218 06:06:06.151239 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerName="glance-log" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.151245 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerName="glance-log" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.151521 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerName="glance-httpd" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.151554 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" containerName="glance-log" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.151575 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerName="glance-log" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.151585 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33e59a5-30a9-421e-bff0-81b5fa831347" containerName="glance-httpd" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.153182 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.156926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.157669 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.157903 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.158049 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9r2t2" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.158267 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.158622 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.160035 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.162204 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.163339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.178363 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.216739 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:06 crc kubenswrapper[4707]: E0218 06:06:06.329191 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd33e59a5_30a9_421e_bff0_81b5fa831347.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd33e59a5_30a9_421e_bff0_81b5fa831347.slice/crio-74267fe7baa2484b92d134d97da6deeede50cacefb86a30ded2061ab8db4036f\": RecentStats: unable to find data in memory cache]" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-ceph\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-logs\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330589 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330635 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tqbg6\" (UniqueName: \"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-kube-api-access-tqbg6\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330773 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-logs\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330837 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.330997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.331026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkd7\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-kube-api-access-pgkd7\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.331062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.331098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqbg6\" (UniqueName: \"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-kube-api-access-tqbg6\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-logs\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434815 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-httpd-run\") pod \"glance-default-internal-api-0\" 
(UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgkd7\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-kube-api-access-pgkd7\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.434975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " 
pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.435006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.435026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-ceph\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.435052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-logs\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.435074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.437364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 
06:06:06.441934 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.443548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.445084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-logs\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.445530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-logs\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.445803 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.449694 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.469515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.470569 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.472272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.473188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.477190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.477638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-ceph\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.480045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.480300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.480547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.488641 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkd7\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-kube-api-access-pgkd7\") pod \"glance-default-internal-api-0\" 
(UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.495605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqbg6\" (UniqueName: \"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-kube-api-access-tqbg6\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.517765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " pod="openstack/glance-default-external-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.528604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.787506 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:06 crc kubenswrapper[4707]: I0218 06:06:06.806227 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:06:08 crc kubenswrapper[4707]: I0218 06:06:08.068364 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc3d982-7089-42d5-84d6-d043a6fddbfe" path="/var/lib/kubelet/pods/9cc3d982-7089-42d5-84d6-d043a6fddbfe/volumes" Feb 18 06:06:08 crc kubenswrapper[4707]: I0218 06:06:08.070442 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d33e59a5-30a9-421e-bff0-81b5fa831347" path="/var/lib/kubelet/pods/d33e59a5-30a9-421e-bff0-81b5fa831347/volumes" Feb 18 06:06:08 crc kubenswrapper[4707]: E0218 06:06:08.113774 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-manila-api:current-podified" Feb 18 06:06:08 crc kubenswrapper[4707]: E0218 06:06:08.113977 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manila-db-sync,Image:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,Command:[/bin/bash],Args:[-c sleep 0 && /usr/bin/manila-manage --config-dir /etc/manila/manila.conf.d db 
sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:job-config-data,ReadOnly:true,MountPath:/etc/manila/manila.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lbrw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42429,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42429,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-db-sync-w2p4x_openstack(0301370e-0d52-4549-93ba-033d6d706508): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:06:08 crc kubenswrapper[4707]: E0218 06:06:08.115197 4707 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/manila-db-sync-w2p4x" podUID="0301370e-0d52-4549-93ba-033d6d706508" Feb 18 06:06:08 crc kubenswrapper[4707]: E0218 06:06:08.134267 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 18 06:06:08 crc kubenswrapper[4707]: E0218 06:06:08.134435 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n647h96h698h65dh68dh7fhb5h676h5b8h56fh5c8h647h5b8h5dchfch5d4h54ch7h649h589h545h6h697h677h6dhd9h66chfh5dchb8h89h655q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82lmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,Image
PullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5d8cc4f5b5-jbjl5_openstack(f21bf944-9893-43da-99fc-5af82d67a34b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:06:08 crc kubenswrapper[4707]: E0218 06:06:08.136393 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 18 06:06:08 crc kubenswrapper[4707]: E0218 06:06:08.136548 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64chd7hd9hb4h8dh76h574h78h87h675h679h5c5h68dh68dh54dh678hc8h54fhf7h87h677h94h5fbhch7bh56fhcdh5d9hcdh546h8bh5dfq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w44pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-648fb46955-dwn6p_openstack(2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:06:08 crc kubenswrapper[4707]: E0218 
06:06:08.137436 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5d8cc4f5b5-jbjl5" podUID="f21bf944-9893-43da-99fc-5af82d67a34b" Feb 18 06:06:08 crc kubenswrapper[4707]: E0218 06:06:08.139999 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-648fb46955-dwn6p" podUID="2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e" Feb 18 06:06:09 crc kubenswrapper[4707]: I0218 06:06:09.070559 4707 generic.go:334] "Generic (PLEG): container finished" podID="668b58a1-6a64-4356-a00f-ccf6faa1ce3b" containerID="3a93156ee1b4f3e91620c4bde12236aa1e4ef2c92c4112e24d4e0a8a2e806da6" exitCode=0 Feb 18 06:06:09 crc kubenswrapper[4707]: I0218 06:06:09.070680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t45dj" event={"ID":"668b58a1-6a64-4356-a00f-ccf6faa1ce3b","Type":"ContainerDied","Data":"3a93156ee1b4f3e91620c4bde12236aa1e4ef2c92c4112e24d4e0a8a2e806da6"} Feb 18 06:06:09 crc kubenswrapper[4707]: E0218 06:06:09.072051 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-manila-api:current-podified\\\"\"" pod="openstack/manila-db-sync-w2p4x" podUID="0301370e-0d52-4549-93ba-033d6d706508" Feb 18 06:06:10 crc kubenswrapper[4707]: 
I0218 06:06:10.219839 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Feb 18 06:06:10 crc kubenswrapper[4707]: I0218 06:06:10.221848 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:06:15 crc kubenswrapper[4707]: I0218 06:06:15.219827 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Feb 18 06:06:16 crc kubenswrapper[4707]: E0218 06:06:16.245660 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 18 06:06:16 crc kubenswrapper[4707]: E0218 06:06:16.246149 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zz2kr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-bpqnj_openstack(3f8ab203-7caa-4df6-9c0c-599a9d1b9612): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:06:16 crc kubenswrapper[4707]: E0218 06:06:16.247895 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-bpqnj" 
podUID="3f8ab203-7caa-4df6-9c0c-599a9d1b9612" Feb 18 06:06:17 crc kubenswrapper[4707]: E0218 06:06:17.163562 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-bpqnj" podUID="3f8ab203-7caa-4df6-9c0c-599a9d1b9612" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.760386 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c594f6b9-4cz2s" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.765392 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.773272 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.779880 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-t45dj" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.847968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0625e994-776d-4baf-a5b8-557b0ec5e11e-horizon-secret-key\") pod \"0625e994-776d-4baf-a5b8-557b0ec5e11e\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0625e994-776d-4baf-a5b8-557b0ec5e11e-logs\") pod \"0625e994-776d-4baf-a5b8-557b0ec5e11e\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848060 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82lmz\" (UniqueName: \"kubernetes.io/projected/f21bf944-9893-43da-99fc-5af82d67a34b-kube-api-access-82lmz\") pod \"f21bf944-9893-43da-99fc-5af82d67a34b\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-logs\") pod \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-scripts\") pod \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czwfv\" (UniqueName: 
\"kubernetes.io/projected/0625e994-776d-4baf-a5b8-557b0ec5e11e-kube-api-access-czwfv\") pod \"0625e994-776d-4baf-a5b8-557b0ec5e11e\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-config-data\") pod \"f21bf944-9893-43da-99fc-5af82d67a34b\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-scripts\") pod \"0625e994-776d-4baf-a5b8-557b0ec5e11e\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f21bf944-9893-43da-99fc-5af82d67a34b-logs\") pod \"f21bf944-9893-43da-99fc-5af82d67a34b\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-config-data\") pod \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848287 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-scripts\") pod \"f21bf944-9893-43da-99fc-5af82d67a34b\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848324 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f21bf944-9893-43da-99fc-5af82d67a34b-horizon-secret-key\") pod \"f21bf944-9893-43da-99fc-5af82d67a34b\" (UID: \"f21bf944-9893-43da-99fc-5af82d67a34b\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-config-data\") pod \"0625e994-776d-4baf-a5b8-557b0ec5e11e\" (UID: \"0625e994-776d-4baf-a5b8-557b0ec5e11e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848415 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-horizon-secret-key\") pod \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.848439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w44pb\" (UniqueName: \"kubernetes.io/projected/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-kube-api-access-w44pb\") pod \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\" (UID: \"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.849545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-scripts" (OuterVolumeSpecName: "scripts") pod "f21bf944-9893-43da-99fc-5af82d67a34b" (UID: "f21bf944-9893-43da-99fc-5af82d67a34b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.849705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-config-data" (OuterVolumeSpecName: "config-data") pod "f21bf944-9893-43da-99fc-5af82d67a34b" (UID: "f21bf944-9893-43da-99fc-5af82d67a34b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.850223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-logs" (OuterVolumeSpecName: "logs") pod "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e" (UID: "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.850700 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-scripts" (OuterVolumeSpecName: "scripts") pod "0625e994-776d-4baf-a5b8-557b0ec5e11e" (UID: "0625e994-776d-4baf-a5b8-557b0ec5e11e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.850858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-config-data" (OuterVolumeSpecName: "config-data") pod "0625e994-776d-4baf-a5b8-557b0ec5e11e" (UID: "0625e994-776d-4baf-a5b8-557b0ec5e11e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.851034 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21bf944-9893-43da-99fc-5af82d67a34b-logs" (OuterVolumeSpecName: "logs") pod "f21bf944-9893-43da-99fc-5af82d67a34b" (UID: "f21bf944-9893-43da-99fc-5af82d67a34b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.851233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-scripts" (OuterVolumeSpecName: "scripts") pod "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e" (UID: "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.851246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0625e994-776d-4baf-a5b8-557b0ec5e11e-logs" (OuterVolumeSpecName: "logs") pod "0625e994-776d-4baf-a5b8-557b0ec5e11e" (UID: "0625e994-776d-4baf-a5b8-557b0ec5e11e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.852018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-config-data" (OuterVolumeSpecName: "config-data") pod "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e" (UID: "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.853824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-kube-api-access-w44pb" (OuterVolumeSpecName: "kube-api-access-w44pb") pod "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e" (UID: "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e"). InnerVolumeSpecName "kube-api-access-w44pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.853849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21bf944-9893-43da-99fc-5af82d67a34b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f21bf944-9893-43da-99fc-5af82d67a34b" (UID: "f21bf944-9893-43da-99fc-5af82d67a34b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.854146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0625e994-776d-4baf-a5b8-557b0ec5e11e-kube-api-access-czwfv" (OuterVolumeSpecName: "kube-api-access-czwfv") pod "0625e994-776d-4baf-a5b8-557b0ec5e11e" (UID: "0625e994-776d-4baf-a5b8-557b0ec5e11e"). InnerVolumeSpecName "kube-api-access-czwfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.855774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e" (UID: "2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.855792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21bf944-9893-43da-99fc-5af82d67a34b-kube-api-access-82lmz" (OuterVolumeSpecName: "kube-api-access-82lmz") pod "f21bf944-9893-43da-99fc-5af82d67a34b" (UID: "f21bf944-9893-43da-99fc-5af82d67a34b"). InnerVolumeSpecName "kube-api-access-82lmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.856292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0625e994-776d-4baf-a5b8-557b0ec5e11e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0625e994-776d-4baf-a5b8-557b0ec5e11e" (UID: "0625e994-776d-4baf-a5b8-557b0ec5e11e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.950338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-config\") pod \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.950467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-combined-ca-bundle\") pod \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\" (UID: \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.950538 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9gcf\" (UniqueName: \"kubernetes.io/projected/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-kube-api-access-h9gcf\") pod \"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\" (UID: 
\"668b58a1-6a64-4356-a00f-ccf6faa1ce3b\") " Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.950936 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.950954 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.950964 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w44pb\" (UniqueName: \"kubernetes.io/projected/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-kube-api-access-w44pb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.950973 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0625e994-776d-4baf-a5b8-557b0ec5e11e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.950982 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0625e994-776d-4baf-a5b8-557b0ec5e11e-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.950991 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82lmz\" (UniqueName: \"kubernetes.io/projected/f21bf944-9893-43da-99fc-5af82d67a34b-kube-api-access-82lmz\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.951000 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.951007 4707 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.951015 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czwfv\" (UniqueName: \"kubernetes.io/projected/0625e994-776d-4baf-a5b8-557b0ec5e11e-kube-api-access-czwfv\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.951023 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.951031 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0625e994-776d-4baf-a5b8-557b0ec5e11e-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.951038 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f21bf944-9893-43da-99fc-5af82d67a34b-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.951046 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.951054 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f21bf944-9893-43da-99fc-5af82d67a34b-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.951061 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f21bf944-9893-43da-99fc-5af82d67a34b-horizon-secret-key\") on node 
\"crc\" DevicePath \"\"" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.954645 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-kube-api-access-h9gcf" (OuterVolumeSpecName: "kube-api-access-h9gcf") pod "668b58a1-6a64-4356-a00f-ccf6faa1ce3b" (UID: "668b58a1-6a64-4356-a00f-ccf6faa1ce3b"). InnerVolumeSpecName "kube-api-access-h9gcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.972488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-config" (OuterVolumeSpecName: "config") pod "668b58a1-6a64-4356-a00f-ccf6faa1ce3b" (UID: "668b58a1-6a64-4356-a00f-ccf6faa1ce3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:17 crc kubenswrapper[4707]: I0218 06:06:17.978346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "668b58a1-6a64-4356-a00f-ccf6faa1ce3b" (UID: "668b58a1-6a64-4356-a00f-ccf6faa1ce3b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.052882 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.052941 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.052970 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9gcf\" (UniqueName: \"kubernetes.io/projected/668b58a1-6a64-4356-a00f-ccf6faa1ce3b-kube-api-access-h9gcf\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.170867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t45dj" event={"ID":"668b58a1-6a64-4356-a00f-ccf6faa1ce3b","Type":"ContainerDied","Data":"30b31f6d120cebfd4d3ffd0a9b5744c97f4501c8829918091e42a179ae66aa6f"} Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.170940 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30b31f6d120cebfd4d3ffd0a9b5744c97f4501c8829918091e42a179ae66aa6f" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.170903 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t45dj" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.171861 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-648fb46955-dwn6p" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.171908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-648fb46955-dwn6p" event={"ID":"2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e","Type":"ContainerDied","Data":"14107fc38e75d6471edc68b9ba4d96842af92920d1947711f938237327b75f47"} Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.174279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c594f6b9-4cz2s" event={"ID":"0625e994-776d-4baf-a5b8-557b0ec5e11e","Type":"ContainerDied","Data":"18e90572167acff79624676fb616b331e27cc7f0a0dfc135a01f5615c7aff2b0"} Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.174305 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c594f6b9-4cz2s" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.175569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d8cc4f5b5-jbjl5" event={"ID":"f21bf944-9893-43da-99fc-5af82d67a34b","Type":"ContainerDied","Data":"e8d9f397f6ecbaad4e93ec91adc4ae059cbd3f16b1d1ee7a1c38560fa5c4f7d9"} Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.175630 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d8cc4f5b5-jbjl5" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.239984 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-648fb46955-dwn6p"] Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.262017 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-648fb46955-dwn6p"] Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.279219 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d8cc4f5b5-jbjl5"] Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.285642 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d8cc4f5b5-jbjl5"] Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.299142 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c594f6b9-4cz2s"] Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.305784 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8c594f6b9-4cz2s"] Feb 18 06:06:18 crc kubenswrapper[4707]: E0218 06:06:18.818318 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 18 06:06:18 crc kubenswrapper[4707]: E0218 06:06:18.818480 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4267g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-4kl7b_openstack(e5f975a7-ca3d-4940-9281-051360f67955): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:06:18 crc kubenswrapper[4707]: E0218 06:06:18.819664 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-4kl7b" podUID="e5f975a7-ca3d-4940-9281-051360f67955" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.863147 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.967647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrlrb\" (UniqueName: \"kubernetes.io/projected/d14db420-fcd2-4cc6-b14e-0b75560e3207-kube-api-access-qrlrb\") pod \"d14db420-fcd2-4cc6-b14e-0b75560e3207\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.967692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-dns-svc\") pod \"d14db420-fcd2-4cc6-b14e-0b75560e3207\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.967748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-config\") pod \"d14db420-fcd2-4cc6-b14e-0b75560e3207\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " Feb 18 06:06:18 crc 
kubenswrapper[4707]: I0218 06:06:18.967812 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-nb\") pod \"d14db420-fcd2-4cc6-b14e-0b75560e3207\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.967881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-sb\") pod \"d14db420-fcd2-4cc6-b14e-0b75560e3207\" (UID: \"d14db420-fcd2-4cc6-b14e-0b75560e3207\") " Feb 18 06:06:18 crc kubenswrapper[4707]: I0218 06:06:18.977710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14db420-fcd2-4cc6-b14e-0b75560e3207-kube-api-access-qrlrb" (OuterVolumeSpecName: "kube-api-access-qrlrb") pod "d14db420-fcd2-4cc6-b14e-0b75560e3207" (UID: "d14db420-fcd2-4cc6-b14e-0b75560e3207"). InnerVolumeSpecName "kube-api-access-qrlrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.074625 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrlrb\" (UniqueName: \"kubernetes.io/projected/d14db420-fcd2-4cc6-b14e-0b75560e3207-kube-api-access-qrlrb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.081542 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9chfr"] Feb 18 06:06:19 crc kubenswrapper[4707]: E0218 06:06:19.081931 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="dnsmasq-dns" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.081947 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="dnsmasq-dns" Feb 18 06:06:19 crc kubenswrapper[4707]: E0218 06:06:19.081971 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="init" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.081977 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="init" Feb 18 06:06:19 crc kubenswrapper[4707]: E0218 06:06:19.081990 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668b58a1-6a64-4356-a00f-ccf6faa1ce3b" containerName="neutron-db-sync" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.081996 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="668b58a1-6a64-4356-a00f-ccf6faa1ce3b" containerName="neutron-db-sync" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.082142 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="668b58a1-6a64-4356-a00f-ccf6faa1ce3b" containerName="neutron-db-sync" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.082162 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" containerName="dnsmasq-dns" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.083071 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.083083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d14db420-fcd2-4cc6-b14e-0b75560e3207" (UID: "d14db420-fcd2-4cc6-b14e-0b75560e3207"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.109081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d14db420-fcd2-4cc6-b14e-0b75560e3207" (UID: "d14db420-fcd2-4cc6-b14e-0b75560e3207"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.118327 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9chfr"] Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.122530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-config" (OuterVolumeSpecName: "config") pod "d14db420-fcd2-4cc6-b14e-0b75560e3207" (UID: "d14db420-fcd2-4cc6-b14e-0b75560e3207"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.129517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d14db420-fcd2-4cc6-b14e-0b75560e3207" (UID: "d14db420-fcd2-4cc6-b14e-0b75560e3207"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-config\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2zzm\" (UniqueName: \"kubernetes.io/projected/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-kube-api-access-n2zzm\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-svc\") pod 
\"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178401 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178418 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178432 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.178443 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d14db420-fcd2-4cc6-b14e-0b75560e3207-config\") on node \"crc\" DevicePath \"\"" Feb 18 
06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.196367 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.198003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bfwr6" event={"ID":"d14db420-fcd2-4cc6-b14e-0b75560e3207","Type":"ContainerDied","Data":"5a474bf4657e1233ac180301a059610b7989a1eefed7e11233c5d6db9f8b33e6"} Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.198072 4707 scope.go:117] "RemoveContainer" containerID="80fce60381ac9e24e7f596b7f229e4e673bb85287f11c60e3520be80c063d434" Feb 18 06:06:19 crc kubenswrapper[4707]: E0218 06:06:19.200065 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-4kl7b" podUID="e5f975a7-ca3d-4940-9281-051360f67955" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.202453 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-767657c548-8lcwd"] Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.206392 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.209246 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.210726 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.211101 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.215574 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ncszv" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.226237 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-767657c548-8lcwd"] Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.264052 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bfwr6"] Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.277406 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bfwr6"] Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.279975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-config\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.280019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 
06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.280043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-httpd-config\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.280067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2zzm\" (UniqueName: \"kubernetes.io/projected/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-kube-api-access-n2zzm\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.280088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-ovndb-tls-certs\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.280140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.280174 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 
06:06:19.280222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.280244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-combined-ca-bundle\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.280261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmm9k\" (UniqueName: \"kubernetes.io/projected/c861568d-c8d4-49f4-9e5d-524c9445b152-kube-api-access-zmm9k\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.280276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-config\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.281282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.281289 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-config\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.282391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.282452 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.282571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.299475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2zzm\" (UniqueName: \"kubernetes.io/projected/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-kube-api-access-n2zzm\") pod \"dnsmasq-dns-5ccc5c4795-9chfr\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.382106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-httpd-config\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.382162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-ovndb-tls-certs\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.382236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-combined-ca-bundle\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.382256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmm9k\" (UniqueName: \"kubernetes.io/projected/c861568d-c8d4-49f4-9e5d-524c9445b152-kube-api-access-zmm9k\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.382273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-config\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.386082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-combined-ca-bundle\") pod 
\"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.386155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-config\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.387082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-httpd-config\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.400615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-ovndb-tls-certs\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.404302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmm9k\" (UniqueName: \"kubernetes.io/projected/c861568d-c8d4-49f4-9e5d-524c9445b152-kube-api-access-zmm9k\") pod \"neutron-767657c548-8lcwd\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.431056 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.529554 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:19 crc kubenswrapper[4707]: E0218 06:06:19.710257 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 18 06:06:19 crc kubenswrapper[4707]: E0218 06:06:19.710453 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n549h58bh9bh67ch65bh697h5f7h56dh674h5ddh98h649h6hdfh685hffh9dh8h669hb4h585h6dh88hf8h549h9bhbdhd5h574h74h5cdh679q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7ggsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pyth
on3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(835f51c5-3996-4120-bd2f-4b3bef33c31d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 06:06:19 crc kubenswrapper[4707]: I0218 06:06:19.823537 4707 scope.go:117] "RemoveContainer" containerID="f37ce5589e05030213c59c072103c26e7e8b8c37db00688e5e4a03a714f05fe5" Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.080563 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0625e994-776d-4baf-a5b8-557b0ec5e11e" path="/var/lib/kubelet/pods/0625e994-776d-4baf-a5b8-557b0ec5e11e/volumes" Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.081416 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e" path="/var/lib/kubelet/pods/2d6a6d13-c7e4-45ae-8fd1-b9f7e440b05e/volumes" Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.081782 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14db420-fcd2-4cc6-b14e-0b75560e3207" path="/var/lib/kubelet/pods/d14db420-fcd2-4cc6-b14e-0b75560e3207/volumes" Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.082555 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21bf944-9893-43da-99fc-5af82d67a34b" path="/var/lib/kubelet/pods/f21bf944-9893-43da-99fc-5af82d67a34b/volumes" Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.210587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4gf8v" event={"ID":"27b6874e-b7bf-4d36-8ea1-66cf492ba327","Type":"ContainerStarted","Data":"1ebbb7fc9bc31795cd57981b0878f89911fcc659b97fd197bce36c9a99dac95d"} Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.234430 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4gf8v" podStartSLOduration=4.499574698 podStartE2EDuration="32.234413494s" podCreationTimestamp="2026-02-18 06:05:48 +0000 UTC" firstStartedPulling="2026-02-18 06:05:49.930972059 +0000 UTC m=+1086.578931193" lastFinishedPulling="2026-02-18 06:06:17.665810855 +0000 UTC m=+1114.313769989" observedRunningTime="2026-02-18 06:06:20.230290644 +0000 UTC m=+1116.878249778" watchObservedRunningTime="2026-02-18 06:06:20.234413494 +0000 UTC m=+1116.882372628" Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.322106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77db99878b-h8xzs"] Feb 18 06:06:20 crc kubenswrapper[4707]: W0218 06:06:20.327710 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aa9efa8_e6b5_4307_89b1_8a67547a35e9.slice/crio-c1af35a9c1849a321724072b794521e8acee239c2577f1f3e2f2153fbede1ba8 WatchSource:0}: Error finding container c1af35a9c1849a321724072b794521e8acee239c2577f1f3e2f2153fbede1ba8: Status 404 returned error can't find the container with id c1af35a9c1849a321724072b794521e8acee239c2577f1f3e2f2153fbede1ba8 Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.331396 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5qczm"] Feb 18 06:06:20 crc 
kubenswrapper[4707]: W0218 06:06:20.333180 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2835aa4_2296_4af8_b099_af250380f599.slice/crio-1ea02a917bc8b7d9117a01eb02dd8f7a1b49c57f6925a6c2d6b693919c3eb3b9 WatchSource:0}: Error finding container 1ea02a917bc8b7d9117a01eb02dd8f7a1b49c57f6925a6c2d6b693919c3eb3b9: Status 404 returned error can't find the container with id 1ea02a917bc8b7d9117a01eb02dd8f7a1b49c57f6925a6c2d6b693919c3eb3b9 Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.337660 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-584c97fdd8-f4pbz"] Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.409090 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:06:20 crc kubenswrapper[4707]: W0218 06:06:20.412878 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9656e2a3_692a_44cb_9260_2b3ae4227e82.slice/crio-0cf9b2c7f9e2b80088d9959c94b1db4dcd925375e000ae341e55d92acdbb6468 WatchSource:0}: Error finding container 0cf9b2c7f9e2b80088d9959c94b1db4dcd925375e000ae341e55d92acdbb6468: Status 404 returned error can't find the container with id 0cf9b2c7f9e2b80088d9959c94b1db4dcd925375e000ae341e55d92acdbb6468 Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.564138 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9chfr"] Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.585436 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:06:20 crc kubenswrapper[4707]: I0218 06:06:20.665041 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-767657c548-8lcwd"] Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.168228 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-59d57d6969-cr5bz"] Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.169856 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.172722 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.175142 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.227056 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59d57d6969-cr5bz"] Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.239180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-internal-tls-certs\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.239256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-combined-ca-bundle\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.239368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjqrw\" (UniqueName: \"kubernetes.io/projected/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-kube-api-access-qjqrw\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 
06:06:21.239464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-httpd-config\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.239674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-ovndb-tls-certs\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.242288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-public-tls-certs\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.242326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-config\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.269375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" event={"ID":"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe","Type":"ContainerStarted","Data":"badd193811f65e9e27c883f2e29d286f816ac03e0324fdc593271c3934ddb7f5"} Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.274557 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-767657c548-8lcwd" event={"ID":"c861568d-c8d4-49f4-9e5d-524c9445b152","Type":"ContainerStarted","Data":"913160af363ffd3cf1aee395a56c62091c2624d4ebdd1bfbc06f99c60292676e"} Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.281364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77db99878b-h8xzs" event={"ID":"6aa9efa8-e6b5-4307-89b1-8a67547a35e9","Type":"ContainerStarted","Data":"c1af35a9c1849a321724072b794521e8acee239c2577f1f3e2f2153fbede1ba8"} Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.284845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9656e2a3-692a-44cb-9260-2b3ae4227e82","Type":"ContainerStarted","Data":"fd117981ca664788b78f8d24fb317f8453f05eab09bfa2489d6ea42e977fa597"} Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.284906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9656e2a3-692a-44cb-9260-2b3ae4227e82","Type":"ContainerStarted","Data":"0cf9b2c7f9e2b80088d9959c94b1db4dcd925375e000ae341e55d92acdbb6468"} Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.286446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5qczm" event={"ID":"a2835aa4-2296-4af8-b099-af250380f599","Type":"ContainerStarted","Data":"5a7d6a439a0870e8147b64adeac25288771e67e2e5bbb4437da0e31d0cb51c02"} Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.286479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5qczm" event={"ID":"a2835aa4-2296-4af8-b099-af250380f599","Type":"ContainerStarted","Data":"1ea02a917bc8b7d9117a01eb02dd8f7a1b49c57f6925a6c2d6b693919c3eb3b9"} Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.288231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-584c97fdd8-f4pbz" 
event={"ID":"fe56b299-664f-478b-9f30-8e2a4c457676","Type":"ContainerStarted","Data":"328aa4e2b64435ab36f682f1e872ec7c28fcc7d0b47c80ed7774a9407077eba1"} Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.292123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2","Type":"ContainerStarted","Data":"3f5b48772cc4ee6ef3792d3b3e3bd66ce34dfa027fc252a44738ef41b381da55"} Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.314534 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5qczm" podStartSLOduration=20.314497941 podStartE2EDuration="20.314497941s" podCreationTimestamp="2026-02-18 06:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:21.310495244 +0000 UTC m=+1117.958454378" watchObservedRunningTime="2026-02-18 06:06:21.314497941 +0000 UTC m=+1117.962457075" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.344586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-ovndb-tls-certs\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.344627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-public-tls-certs\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.344644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-config\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.344681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-internal-tls-certs\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.344700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-combined-ca-bundle\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.344739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqrw\" (UniqueName: \"kubernetes.io/projected/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-kube-api-access-qjqrw\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.344759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-httpd-config\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.350625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-combined-ca-bundle\") pod 
\"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.350809 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-ovndb-tls-certs\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.352074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-httpd-config\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.352656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-public-tls-certs\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.362129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-internal-tls-certs\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.363336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-config\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc 
kubenswrapper[4707]: I0218 06:06:21.363713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjqrw\" (UniqueName: \"kubernetes.io/projected/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-kube-api-access-qjqrw\") pod \"neutron-59d57d6969-cr5bz\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.382332 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.382389 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.382439 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.383107 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27b00527a1a2dd19572cd4b34eeb62edfbb26a8ff621d8e0ad7b7a217cf69cd3"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.383158 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" 
podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://27b00527a1a2dd19572cd4b34eeb62edfbb26a8ff621d8e0ad7b7a217cf69cd3" gracePeriod=600 Feb 18 06:06:21 crc kubenswrapper[4707]: I0218 06:06:21.531692 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.185057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59d57d6969-cr5bz"] Feb 18 06:06:22 crc kubenswrapper[4707]: W0218 06:06:22.198779 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeada398c_c6c5_4cbe_b221_1c3461fa8cd8.slice/crio-46b18c9a9a20ed28c742f389d1f7de58b7a33d5ca5ec209c8e73d3c052cd0a55 WatchSource:0}: Error finding container 46b18c9a9a20ed28c742f389d1f7de58b7a33d5ca5ec209c8e73d3c052cd0a55: Status 404 returned error can't find the container with id 46b18c9a9a20ed28c742f389d1f7de58b7a33d5ca5ec209c8e73d3c052cd0a55 Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.313569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2","Type":"ContainerStarted","Data":"774ed4ddbb9a9aa8cb8b1341283c9576f6ec331e3eba7306eea0e6e212f6aa06"} Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.321781 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" containerID="e37a9ab49c4ce221b61837a42416bb6a3ef73d7b13c3090449465be7a843e5c1" exitCode=0 Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.321860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" event={"ID":"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe","Type":"ContainerDied","Data":"e37a9ab49c4ce221b61837a42416bb6a3ef73d7b13c3090449465be7a843e5c1"} Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 
06:06:22.327537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77db99878b-h8xzs" event={"ID":"6aa9efa8-e6b5-4307-89b1-8a67547a35e9","Type":"ContainerStarted","Data":"3d76c8c2dc46458f30983ff42dbe68736e517913cf816891d81a1564754ce66f"} Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.355837 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="27b00527a1a2dd19572cd4b34eeb62edfbb26a8ff621d8e0ad7b7a217cf69cd3" exitCode=0 Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.356195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"27b00527a1a2dd19572cd4b34eeb62edfbb26a8ff621d8e0ad7b7a217cf69cd3"} Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.356240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"3eb8d09ea3950a1c29c70e73d11ea5133c61c40a9512fdef46057924b3898430"} Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.356267 4707 scope.go:117] "RemoveContainer" containerID="9c49c62c491f92ccee8fba7d855c10e4bbd43134f1f4f54ad1885c2004d7c90c" Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.362613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"835f51c5-3996-4120-bd2f-4b3bef33c31d","Type":"ContainerStarted","Data":"ba6be017b8075d81c56f6f49cf226d8481b69e5c48326636118cd0f0f5040bb4"} Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.367391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-767657c548-8lcwd" event={"ID":"c861568d-c8d4-49f4-9e5d-524c9445b152","Type":"ContainerStarted","Data":"70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a"} Feb 18 06:06:22 
crc kubenswrapper[4707]: I0218 06:06:22.367693 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.380338 4707 generic.go:334] "Generic (PLEG): container finished" podID="27b6874e-b7bf-4d36-8ea1-66cf492ba327" containerID="1ebbb7fc9bc31795cd57981b0878f89911fcc659b97fd197bce36c9a99dac95d" exitCode=0 Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.380440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4gf8v" event={"ID":"27b6874e-b7bf-4d36-8ea1-66cf492ba327","Type":"ContainerDied","Data":"1ebbb7fc9bc31795cd57981b0878f89911fcc659b97fd197bce36c9a99dac95d"} Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.384584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d57d6969-cr5bz" event={"ID":"eada398c-c6c5-4cbe-b221-1c3461fa8cd8","Type":"ContainerStarted","Data":"46b18c9a9a20ed28c742f389d1f7de58b7a33d5ca5ec209c8e73d3c052cd0a55"} Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.388922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-584c97fdd8-f4pbz" event={"ID":"fe56b299-664f-478b-9f30-8e2a4c457676","Type":"ContainerStarted","Data":"29a0f7ee99a161c9430446c37f0a44e132007f8c079b432ace3e5f0964a9f2cc"} Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.433888 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-767657c548-8lcwd" podStartSLOduration=3.433867423 podStartE2EDuration="3.433867423s" podCreationTimestamp="2026-02-18 06:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:22.392177695 +0000 UTC m=+1119.040136829" watchObservedRunningTime="2026-02-18 06:06:22.433867423 +0000 UTC m=+1119.081826557" Feb 18 06:06:22 crc kubenswrapper[4707]: I0218 06:06:22.447896 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-584c97fdd8-f4pbz" podStartSLOduration=24.513499968 podStartE2EDuration="25.447869458s" podCreationTimestamp="2026-02-18 06:05:57 +0000 UTC" firstStartedPulling="2026-02-18 06:06:20.344593339 +0000 UTC m=+1116.992552473" lastFinishedPulling="2026-02-18 06:06:21.278962819 +0000 UTC m=+1117.926921963" observedRunningTime="2026-02-18 06:06:22.421648595 +0000 UTC m=+1119.069607749" watchObservedRunningTime="2026-02-18 06:06:22.447869458 +0000 UTC m=+1119.095828592" Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.406213 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-w2p4x" event={"ID":"0301370e-0d52-4549-93ba-033d6d706508","Type":"ContainerStarted","Data":"41d8068787160a33336b10393dce5cd5dbb2fb1f382dfb8518fa0426ef1569e8"} Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.418589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-584c97fdd8-f4pbz" event={"ID":"fe56b299-664f-478b-9f30-8e2a4c457676","Type":"ContainerStarted","Data":"22c76c3f5497afbf94a962ad1ce73ddde6f0928e5b98bac214feb82c9ced4db8"} Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.423840 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-w2p4x" podStartSLOduration=3.93626173 podStartE2EDuration="36.423785482s" podCreationTimestamp="2026-02-18 06:05:47 +0000 UTC" firstStartedPulling="2026-02-18 06:05:50.108468579 +0000 UTC m=+1086.756427713" lastFinishedPulling="2026-02-18 06:06:22.595992331 +0000 UTC m=+1119.243951465" observedRunningTime="2026-02-18 06:06:23.418139581 +0000 UTC m=+1120.066098735" watchObservedRunningTime="2026-02-18 06:06:23.423785482 +0000 UTC m=+1120.071744616" Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.426690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2","Type":"ContainerStarted","Data":"77e60f34e134a5e24802740f91dd468fd680904bcd33b2525e04274c39738250"} Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.447600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" event={"ID":"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe","Type":"ContainerStarted","Data":"b147222add7ebbbf08a4b2a17aeaed24640cd82bf7d862f1b394d19336761830"} Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.447643 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.455358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77db99878b-h8xzs" event={"ID":"6aa9efa8-e6b5-4307-89b1-8a67547a35e9","Type":"ContainerStarted","Data":"36c065eacc5ed64fef5f31a7a6cdc0521310e684824674faa9a8194d726b7fb4"} Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.481266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d57d6969-cr5bz" event={"ID":"eada398c-c6c5-4cbe-b221-1c3461fa8cd8","Type":"ContainerStarted","Data":"dd39673696c1c71348820e2f471078d3a0a78f235d0d501e6988b7a84c1af716"} Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.481321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d57d6969-cr5bz" event={"ID":"eada398c-c6c5-4cbe-b221-1c3461fa8cd8","Type":"ContainerStarted","Data":"51a6b225dae50b85cc67ae1341f9aa5bed51407265f983bd797477bf072586d4"} Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.481716 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.490073 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767657c548-8lcwd_c861568d-c8d4-49f4-9e5d-524c9445b152/neutron-httpd/0.log" Feb 18 06:06:23 crc 
kubenswrapper[4707]: I0218 06:06:23.490752 4707 generic.go:334] "Generic (PLEG): container finished" podID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerID="6fa9b42ca5359dd78e392e4f97ca9e377b7377efb58eb4e51aaa0857cf2f04fc" exitCode=1 Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.490945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-767657c548-8lcwd" event={"ID":"c861568d-c8d4-49f4-9e5d-524c9445b152","Type":"ContainerDied","Data":"6fa9b42ca5359dd78e392e4f97ca9e377b7377efb58eb4e51aaa0857cf2f04fc"} Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.492069 4707 scope.go:117] "RemoveContainer" containerID="6fa9b42ca5359dd78e392e4f97ca9e377b7377efb58eb4e51aaa0857cf2f04fc" Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.515664 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9656e2a3-692a-44cb-9260-2b3ae4227e82","Type":"ContainerStarted","Data":"8ba5d40251905a9d76e41bc1b6a66cd86b068bdd7465c2eec88fc3f7a3f5597b"} Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.528061 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" podStartSLOduration=4.528044128 podStartE2EDuration="4.528044128s" podCreationTimestamp="2026-02-18 06:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:23.480639596 +0000 UTC m=+1120.128598730" watchObservedRunningTime="2026-02-18 06:06:23.528044128 +0000 UTC m=+1120.176003262" Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.530405 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=17.530392661 podStartE2EDuration="17.530392661s" podCreationTimestamp="2026-02-18 06:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:23.459689265 +0000 UTC m=+1120.107648399" watchObservedRunningTime="2026-02-18 06:06:23.530392661 +0000 UTC m=+1120.178351795" Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.571724 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77db99878b-h8xzs" podStartSLOduration=25.638268695 podStartE2EDuration="26.571699649s" podCreationTimestamp="2026-02-18 06:05:57 +0000 UTC" firstStartedPulling="2026-02-18 06:06:20.330917282 +0000 UTC m=+1116.978876416" lastFinishedPulling="2026-02-18 06:06:21.264348236 +0000 UTC m=+1117.912307370" observedRunningTime="2026-02-18 06:06:23.516221701 +0000 UTC m=+1120.164180835" watchObservedRunningTime="2026-02-18 06:06:23.571699649 +0000 UTC m=+1120.219658783" Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.588095 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=17.588077738 podStartE2EDuration="17.588077738s" podCreationTimestamp="2026-02-18 06:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:23.545689292 +0000 UTC m=+1120.193648426" watchObservedRunningTime="2026-02-18 06:06:23.588077738 +0000 UTC m=+1120.236036872" Feb 18 06:06:23 crc kubenswrapper[4707]: I0218 06:06:23.647571 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-59d57d6969-cr5bz" podStartSLOduration=2.647557144 podStartE2EDuration="2.647557144s" podCreationTimestamp="2026-02-18 06:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:23.644599604 +0000 UTC m=+1120.292558738" watchObservedRunningTime="2026-02-18 06:06:23.647557144 +0000 UTC m=+1120.295516278" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 
06:06:24.099249 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4gf8v" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.208282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-scripts\") pod \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.208431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrcsl\" (UniqueName: \"kubernetes.io/projected/27b6874e-b7bf-4d36-8ea1-66cf492ba327-kube-api-access-zrcsl\") pod \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.208466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-combined-ca-bundle\") pod \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.208535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-config-data\") pod \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.208563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b6874e-b7bf-4d36-8ea1-66cf492ba327-logs\") pod \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\" (UID: \"27b6874e-b7bf-4d36-8ea1-66cf492ba327\") " Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.209185 4707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/27b6874e-b7bf-4d36-8ea1-66cf492ba327-logs" (OuterVolumeSpecName: "logs") pod "27b6874e-b7bf-4d36-8ea1-66cf492ba327" (UID: "27b6874e-b7bf-4d36-8ea1-66cf492ba327"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.217372 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-scripts" (OuterVolumeSpecName: "scripts") pod "27b6874e-b7bf-4d36-8ea1-66cf492ba327" (UID: "27b6874e-b7bf-4d36-8ea1-66cf492ba327"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.219440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b6874e-b7bf-4d36-8ea1-66cf492ba327-kube-api-access-zrcsl" (OuterVolumeSpecName: "kube-api-access-zrcsl") pod "27b6874e-b7bf-4d36-8ea1-66cf492ba327" (UID: "27b6874e-b7bf-4d36-8ea1-66cf492ba327"). InnerVolumeSpecName "kube-api-access-zrcsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.246833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-config-data" (OuterVolumeSpecName: "config-data") pod "27b6874e-b7bf-4d36-8ea1-66cf492ba327" (UID: "27b6874e-b7bf-4d36-8ea1-66cf492ba327"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.255837 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27b6874e-b7bf-4d36-8ea1-66cf492ba327" (UID: "27b6874e-b7bf-4d36-8ea1-66cf492ba327"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.310880 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.310918 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrcsl\" (UniqueName: \"kubernetes.io/projected/27b6874e-b7bf-4d36-8ea1-66cf492ba327-kube-api-access-zrcsl\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.310932 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.310941 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b6874e-b7bf-4d36-8ea1-66cf492ba327-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.310949 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b6874e-b7bf-4d36-8ea1-66cf492ba327-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.519642 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767657c548-8lcwd_c861568d-c8d4-49f4-9e5d-524c9445b152/neutron-httpd/1.log" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.521101 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767657c548-8lcwd_c861568d-c8d4-49f4-9e5d-524c9445b152/neutron-httpd/0.log" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.521603 4707 generic.go:334] "Generic (PLEG): container finished" podID="c861568d-c8d4-49f4-9e5d-524c9445b152" 
containerID="40f9a7cfd85c3dd54411e77cee661183201be52245695b82633085d580003ce2" exitCode=1 Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.521732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-767657c548-8lcwd" event={"ID":"c861568d-c8d4-49f4-9e5d-524c9445b152","Type":"ContainerDied","Data":"40f9a7cfd85c3dd54411e77cee661183201be52245695b82633085d580003ce2"} Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.521980 4707 scope.go:117] "RemoveContainer" containerID="6fa9b42ca5359dd78e392e4f97ca9e377b7377efb58eb4e51aaa0857cf2f04fc" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.522529 4707 scope.go:117] "RemoveContainer" containerID="40f9a7cfd85c3dd54411e77cee661183201be52245695b82633085d580003ce2" Feb 18 06:06:24 crc kubenswrapper[4707]: E0218 06:06:24.523099 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-767657c548-8lcwd_openstack(c861568d-c8d4-49f4-9e5d-524c9445b152)\"" pod="openstack/neutron-767657c548-8lcwd" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.527817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4gf8v" event={"ID":"27b6874e-b7bf-4d36-8ea1-66cf492ba327","Type":"ContainerDied","Data":"ae9c35ad76f7ed6fa22fa3ae8fef683c38e5901d174e50070616b6c380019cbe"} Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.527862 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae9c35ad76f7ed6fa22fa3ae8fef683c38e5901d174e50070616b6c380019cbe" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.528619 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4gf8v" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.622460 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f79f4d956-gdrpq"] Feb 18 06:06:24 crc kubenswrapper[4707]: E0218 06:06:24.622878 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b6874e-b7bf-4d36-8ea1-66cf492ba327" containerName="placement-db-sync" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.622894 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b6874e-b7bf-4d36-8ea1-66cf492ba327" containerName="placement-db-sync" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.623119 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b6874e-b7bf-4d36-8ea1-66cf492ba327" containerName="placement-db-sync" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.624768 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.632347 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.632830 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.632975 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6lwn9" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.646290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f79f4d956-gdrpq"] Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.685354 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.696019 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"placement-scripts" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.731694 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-public-tls-certs\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.731743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-config-data\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.731762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-internal-tls-certs\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.731832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-logs\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.731877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq5kc\" (UniqueName: \"kubernetes.io/projected/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-kube-api-access-sq5kc\") pod \"placement-7f79f4d956-gdrpq\" (UID: 
\"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.731901 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-scripts\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.731929 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-combined-ca-bundle\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.834244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-config-data\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.834301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-internal-tls-certs\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.834380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-logs\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " 
pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.834431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq5kc\" (UniqueName: \"kubernetes.io/projected/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-kube-api-access-sq5kc\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.834455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-scripts\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.834700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-combined-ca-bundle\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.834738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-public-tls-certs\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.835892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-logs\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 
06:06:24.844772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-internal-tls-certs\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.850224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-combined-ca-bundle\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.853733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-config-data\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.854345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-public-tls-certs\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.859925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq5kc\" (UniqueName: \"kubernetes.io/projected/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-kube-api-access-sq5kc\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:24 crc kubenswrapper[4707]: I0218 06:06:24.866326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-scripts\") pod \"placement-7f79f4d956-gdrpq\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:25 crc kubenswrapper[4707]: I0218 06:06:25.111734 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:25 crc kubenswrapper[4707]: I0218 06:06:25.551341 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2835aa4-2296-4af8-b099-af250380f599" containerID="5a7d6a439a0870e8147b64adeac25288771e67e2e5bbb4437da0e31d0cb51c02" exitCode=0 Feb 18 06:06:25 crc kubenswrapper[4707]: I0218 06:06:25.551496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5qczm" event={"ID":"a2835aa4-2296-4af8-b099-af250380f599","Type":"ContainerDied","Data":"5a7d6a439a0870e8147b64adeac25288771e67e2e5bbb4437da0e31d0cb51c02"} Feb 18 06:06:25 crc kubenswrapper[4707]: I0218 06:06:25.576701 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767657c548-8lcwd_c861568d-c8d4-49f4-9e5d-524c9445b152/neutron-httpd/1.log" Feb 18 06:06:25 crc kubenswrapper[4707]: I0218 06:06:25.579561 4707 scope.go:117] "RemoveContainer" containerID="40f9a7cfd85c3dd54411e77cee661183201be52245695b82633085d580003ce2" Feb 18 06:06:25 crc kubenswrapper[4707]: E0218 06:06:25.579938 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-767657c548-8lcwd_openstack(c861568d-c8d4-49f4-9e5d-524c9445b152)\"" pod="openstack/neutron-767657c548-8lcwd" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" Feb 18 06:06:25 crc kubenswrapper[4707]: I0218 06:06:25.693510 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f79f4d956-gdrpq"] Feb 18 06:06:26 crc kubenswrapper[4707]: I0218 
06:06:26.596310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f79f4d956-gdrpq" event={"ID":"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f","Type":"ContainerStarted","Data":"40546d80fe707054739315f51cd066215cfbccad2c56e30729c28d87b3f2d906"} Feb 18 06:06:26 crc kubenswrapper[4707]: I0218 06:06:26.788522 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:26 crc kubenswrapper[4707]: I0218 06:06:26.788709 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:26 crc kubenswrapper[4707]: I0218 06:06:26.806935 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 18 06:06:26 crc kubenswrapper[4707]: I0218 06:06:26.807329 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 18 06:06:26 crc kubenswrapper[4707]: I0218 06:06:26.846929 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:26 crc kubenswrapper[4707]: I0218 06:06:26.852279 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:26 crc kubenswrapper[4707]: I0218 06:06:26.858205 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 18 06:06:26 crc kubenswrapper[4707]: I0218 06:06:26.860639 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.010470 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.187549 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-scripts\") pod \"a2835aa4-2296-4af8-b099-af250380f599\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.187693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-combined-ca-bundle\") pod \"a2835aa4-2296-4af8-b099-af250380f599\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.188528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-credential-keys\") pod \"a2835aa4-2296-4af8-b099-af250380f599\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.188615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-fernet-keys\") pod \"a2835aa4-2296-4af8-b099-af250380f599\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.188646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-config-data\") pod \"a2835aa4-2296-4af8-b099-af250380f599\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.188696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98dpg\" (UniqueName: 
\"kubernetes.io/projected/a2835aa4-2296-4af8-b099-af250380f599-kube-api-access-98dpg\") pod \"a2835aa4-2296-4af8-b099-af250380f599\" (UID: \"a2835aa4-2296-4af8-b099-af250380f599\") " Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.193265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-scripts" (OuterVolumeSpecName: "scripts") pod "a2835aa4-2296-4af8-b099-af250380f599" (UID: "a2835aa4-2296-4af8-b099-af250380f599"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.194554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2835aa4-2296-4af8-b099-af250380f599-kube-api-access-98dpg" (OuterVolumeSpecName: "kube-api-access-98dpg") pod "a2835aa4-2296-4af8-b099-af250380f599" (UID: "a2835aa4-2296-4af8-b099-af250380f599"). InnerVolumeSpecName "kube-api-access-98dpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.195056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a2835aa4-2296-4af8-b099-af250380f599" (UID: "a2835aa4-2296-4af8-b099-af250380f599"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.222086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a2835aa4-2296-4af8-b099-af250380f599" (UID: "a2835aa4-2296-4af8-b099-af250380f599"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.229947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-config-data" (OuterVolumeSpecName: "config-data") pod "a2835aa4-2296-4af8-b099-af250380f599" (UID: "a2835aa4-2296-4af8-b099-af250380f599"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.245637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2835aa4-2296-4af8-b099-af250380f599" (UID: "a2835aa4-2296-4af8-b099-af250380f599"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.291598 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.291658 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.291679 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98dpg\" (UniqueName: \"kubernetes.io/projected/a2835aa4-2296-4af8-b099-af250380f599-kube-api-access-98dpg\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.291701 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:27 crc 
kubenswrapper[4707]: I0218 06:06:27.291716 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.291734 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2835aa4-2296-4af8-b099-af250380f599-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.441996 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-584c97fdd8-f4pbz" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.442156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-584c97fdd8-f4pbz" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.544148 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77db99878b-h8xzs" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.544235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-77db99878b-h8xzs" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.622494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5qczm" event={"ID":"a2835aa4-2296-4af8-b099-af250380f599","Type":"ContainerDied","Data":"1ea02a917bc8b7d9117a01eb02dd8f7a1b49c57f6925a6c2d6b693919c3eb3b9"} Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.622535 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5qczm" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.622558 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ea02a917bc8b7d9117a01eb02dd8f7a1b49c57f6925a6c2d6b693919c3eb3b9" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.623078 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.623358 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.623387 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.623398 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.779838 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6cc5d5b844-m7q6c"] Feb 18 06:06:27 crc kubenswrapper[4707]: E0218 06:06:27.780472 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2835aa4-2296-4af8-b099-af250380f599" containerName="keystone-bootstrap" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.780501 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2835aa4-2296-4af8-b099-af250380f599" containerName="keystone-bootstrap" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.780756 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2835aa4-2296-4af8-b099-af250380f599" containerName="keystone-bootstrap" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.781519 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.785324 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.785811 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.786014 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.786400 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.787548 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lqdm4" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.791365 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.814774 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cc5d5b844-m7q6c"] Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.946614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2698s\" (UniqueName: \"kubernetes.io/projected/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-kube-api-access-2698s\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.946712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-combined-ca-bundle\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: 
\"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.946768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-public-tls-certs\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.946808 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-scripts\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.946856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-fernet-keys\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.946882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-config-data\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.946905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-credential-keys\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: 
\"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:27 crc kubenswrapper[4707]: I0218 06:06:27.946940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-internal-tls-certs\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.050481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-config-data\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.050595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-credential-keys\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.050641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-internal-tls-certs\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.050726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2698s\" (UniqueName: \"kubernetes.io/projected/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-kube-api-access-2698s\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " 
pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.050833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-combined-ca-bundle\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.050952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-public-tls-certs\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.051011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-scripts\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.051094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-fernet-keys\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.055716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-fernet-keys\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.056332 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-combined-ca-bundle\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.057418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-config-data\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.058746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-internal-tls-certs\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.062021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-public-tls-certs\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.062565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-credential-keys\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.062950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-scripts\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.072408 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2698s\" (UniqueName: \"kubernetes.io/projected/d9158ecc-f6e5-4c3f-a7e8-9195a34648b3-kube-api-access-2698s\") pod \"keystone-6cc5d5b844-m7q6c\" (UID: \"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3\") " pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.110900 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.505549 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-986f6fbf8-z89c7"] Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.507364 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.528244 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-986f6fbf8-z89c7"] Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.644152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f79f4d956-gdrpq" event={"ID":"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f","Type":"ContainerStarted","Data":"e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55"} Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.665116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt74d\" (UniqueName: \"kubernetes.io/projected/9d4c03f6-2a0f-460b-9b68-50838289b469-kube-api-access-pt74d\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.665175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-internal-tls-certs\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.665211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-config-data\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.665245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9d4c03f6-2a0f-460b-9b68-50838289b469-logs\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.665264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-scripts\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.665289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-public-tls-certs\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.665313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-combined-ca-bundle\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.767206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt74d\" (UniqueName: \"kubernetes.io/projected/9d4c03f6-2a0f-460b-9b68-50838289b469-kube-api-access-pt74d\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.767556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-internal-tls-certs\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.767597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-config-data\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.767636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4c03f6-2a0f-460b-9b68-50838289b469-logs\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.767670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-scripts\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.767697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-public-tls-certs\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.767730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-combined-ca-bundle\") pod \"placement-986f6fbf8-z89c7\" (UID: 
\"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.769233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4c03f6-2a0f-460b-9b68-50838289b469-logs\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.777083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-config-data\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.777349 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-scripts\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.777523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-internal-tls-certs\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.778361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-public-tls-certs\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.780280 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4c03f6-2a0f-460b-9b68-50838289b469-combined-ca-bundle\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.789010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt74d\" (UniqueName: \"kubernetes.io/projected/9d4c03f6-2a0f-460b-9b68-50838289b469-kube-api-access-pt74d\") pod \"placement-986f6fbf8-z89c7\" (UID: \"9d4c03f6-2a0f-460b-9b68-50838289b469\") " pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:28 crc kubenswrapper[4707]: I0218 06:06:28.874045 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:29 crc kubenswrapper[4707]: I0218 06:06:29.432953 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:06:29 crc kubenswrapper[4707]: I0218 06:06:29.524393 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hdgrh"] Feb 18 06:06:29 crc kubenswrapper[4707]: I0218 06:06:29.524629 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" podUID="74700413-a033-43e0-b01c-04c21e097135" containerName="dnsmasq-dns" containerID="cri-o://da811365dd7843c486e68f03e585fd831a31bf044b9039fb00616cd3ce326c08" gracePeriod=10 Feb 18 06:06:29 crc kubenswrapper[4707]: I0218 06:06:29.658701 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:06:29 crc kubenswrapper[4707]: I0218 06:06:29.658736 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:06:29 crc kubenswrapper[4707]: I0218 06:06:29.992678 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/glance-default-external-api-0" Feb 18 06:06:30 crc kubenswrapper[4707]: I0218 06:06:30.091166 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:30 crc kubenswrapper[4707]: I0218 06:06:30.093183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 18 06:06:30 crc kubenswrapper[4707]: I0218 06:06:30.680216 4707 generic.go:334] "Generic (PLEG): container finished" podID="74700413-a033-43e0-b01c-04c21e097135" containerID="da811365dd7843c486e68f03e585fd831a31bf044b9039fb00616cd3ce326c08" exitCode=0 Feb 18 06:06:30 crc kubenswrapper[4707]: I0218 06:06:30.680338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" event={"ID":"74700413-a033-43e0-b01c-04c21e097135","Type":"ContainerDied","Data":"da811365dd7843c486e68f03e585fd831a31bf044b9039fb00616cd3ce326c08"} Feb 18 06:06:31 crc kubenswrapper[4707]: I0218 06:06:31.484618 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.056766 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.344316 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.511398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-svc\") pod \"74700413-a033-43e0-b01c-04c21e097135\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.511878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-swift-storage-0\") pod \"74700413-a033-43e0-b01c-04c21e097135\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.511962 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-nb\") pod \"74700413-a033-43e0-b01c-04c21e097135\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.511983 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgvl9\" (UniqueName: \"kubernetes.io/projected/74700413-a033-43e0-b01c-04c21e097135-kube-api-access-hgvl9\") pod \"74700413-a033-43e0-b01c-04c21e097135\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.512149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-config\") pod \"74700413-a033-43e0-b01c-04c21e097135\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.513008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-sb\") pod \"74700413-a033-43e0-b01c-04c21e097135\" (UID: \"74700413-a033-43e0-b01c-04c21e097135\") " Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.523103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74700413-a033-43e0-b01c-04c21e097135-kube-api-access-hgvl9" (OuterVolumeSpecName: "kube-api-access-hgvl9") pod "74700413-a033-43e0-b01c-04c21e097135" (UID: "74700413-a033-43e0-b01c-04c21e097135"). InnerVolumeSpecName "kube-api-access-hgvl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.615770 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgvl9\" (UniqueName: \"kubernetes.io/projected/74700413-a033-43e0-b01c-04c21e097135-kube-api-access-hgvl9\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.716851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74700413-a033-43e0-b01c-04c21e097135" (UID: "74700413-a033-43e0-b01c-04c21e097135"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.717997 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.722601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74700413-a033-43e0-b01c-04c21e097135" (UID: "74700413-a033-43e0-b01c-04c21e097135"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.723655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bpqnj" event={"ID":"3f8ab203-7caa-4df6-9c0c-599a9d1b9612","Type":"ContainerStarted","Data":"36085a9410c8c53d98b5847c5b00a965f89d979f3a53a99d89fba2d10f3760e8"} Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.725485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-config" (OuterVolumeSpecName: "config") pod "74700413-a033-43e0-b01c-04c21e097135" (UID: "74700413-a033-43e0-b01c-04c21e097135"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.729194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f79f4d956-gdrpq" event={"ID":"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f","Type":"ContainerStarted","Data":"6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3"} Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.731644 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.731687 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.733335 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" event={"ID":"74700413-a033-43e0-b01c-04c21e097135","Type":"ContainerDied","Data":"dac8966104dc9881302c8a9a0af2681def4fd165b5a152838663a309a155b071"} Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.733372 4707 scope.go:117] "RemoveContainer" containerID="da811365dd7843c486e68f03e585fd831a31bf044b9039fb00616cd3ce326c08" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.733469 
4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-hdgrh" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.739270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74700413-a033-43e0-b01c-04c21e097135" (UID: "74700413-a033-43e0-b01c-04c21e097135"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.740318 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-7f79f4d956-gdrpq" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.160:8778/\": dial tcp 10.217.0.160:8778: connect: connection refused" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.749264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74700413-a033-43e0-b01c-04c21e097135" (UID: "74700413-a033-43e0-b01c-04c21e097135"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.757674 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-bpqnj" podStartSLOduration=3.3922970660000002 podStartE2EDuration="46.757648103s" podCreationTimestamp="2026-02-18 06:05:47 +0000 UTC" firstStartedPulling="2026-02-18 06:05:49.974520572 +0000 UTC m=+1086.622479706" lastFinishedPulling="2026-02-18 06:06:33.339871609 +0000 UTC m=+1129.987830743" observedRunningTime="2026-02-18 06:06:33.749452824 +0000 UTC m=+1130.397411948" watchObservedRunningTime="2026-02-18 06:06:33.757648103 +0000 UTC m=+1130.405607237" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.768971 4707 scope.go:117] "RemoveContainer" containerID="8001a3fee03d5ce3fdbefa5a329bdef9cc9e4d29cebf335e7307a31daeb7f800" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.802229 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f79f4d956-gdrpq" podStartSLOduration=9.802207588 podStartE2EDuration="9.802207588s" podCreationTimestamp="2026-02-18 06:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:33.779491919 +0000 UTC m=+1130.427451053" watchObservedRunningTime="2026-02-18 06:06:33.802207588 +0000 UTC m=+1130.450166722" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.823144 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.823174 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 
06:06:33.823184 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.823192 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74700413-a033-43e0-b01c-04c21e097135-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.825180 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cc5d5b844-m7q6c"] Feb 18 06:06:33 crc kubenswrapper[4707]: I0218 06:06:33.835833 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-986f6fbf8-z89c7"] Feb 18 06:06:33 crc kubenswrapper[4707]: W0218 06:06:33.835932 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9158ecc_f6e5_4c3f_a7e8_9195a34648b3.slice/crio-1c01b83663b08cf317aa2f2604459a0716941644e09085747f883dc811fee05d WatchSource:0}: Error finding container 1c01b83663b08cf317aa2f2604459a0716941644e09085747f883dc811fee05d: Status 404 returned error can't find the container with id 1c01b83663b08cf317aa2f2604459a0716941644e09085747f883dc811fee05d Feb 18 06:06:34 crc kubenswrapper[4707]: I0218 06:06:34.168072 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hdgrh"] Feb 18 06:06:34 crc kubenswrapper[4707]: I0218 06:06:34.182698 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hdgrh"] Feb 18 06:06:34 crc kubenswrapper[4707]: I0218 06:06:34.993186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cc5d5b844-m7q6c" 
event={"ID":"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3","Type":"ContainerStarted","Data":"161450f3028a9c3d463bf7c96c414d8ecaaba3e622b2eaa54f6772ae8344de59"} Feb 18 06:06:34 crc kubenswrapper[4707]: I0218 06:06:34.993764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cc5d5b844-m7q6c" event={"ID":"d9158ecc-f6e5-4c3f-a7e8-9195a34648b3","Type":"ContainerStarted","Data":"1c01b83663b08cf317aa2f2604459a0716941644e09085747f883dc811fee05d"} Feb 18 06:06:34 crc kubenswrapper[4707]: I0218 06:06:34.993933 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.007977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4kl7b" event={"ID":"e5f975a7-ca3d-4940-9281-051360f67955","Type":"ContainerStarted","Data":"279f5535fb47a6f51575f5c80b1d90fab834c7b8ae4cc1cbc8af455923031057"} Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.014471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-986f6fbf8-z89c7" event={"ID":"9d4c03f6-2a0f-460b-9b68-50838289b469","Type":"ContainerStarted","Data":"4d7f9032319da8c79b9016094be1f0bbaea83e0bdfa14ca352606b947f915854"} Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.014542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-986f6fbf8-z89c7" event={"ID":"9d4c03f6-2a0f-460b-9b68-50838289b469","Type":"ContainerStarted","Data":"cc51c9b3fb10cd4dbd320072942ed31846dc6448fed622e66a3e339ad69489db"} Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.014598 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-986f6fbf8-z89c7" event={"ID":"9d4c03f6-2a0f-460b-9b68-50838289b469","Type":"ContainerStarted","Data":"d0bf2a864ac6ff567253ebf61ba4ad020b169df9d2898eada5cf777329c357ce"} Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.014639 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.014940 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.031317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"835f51c5-3996-4120-bd2f-4b3bef33c31d","Type":"ContainerStarted","Data":"25585314e4b53e676ff01370bfa90e26dce3f9c36989ea869fae981da6c662c9"} Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.041603 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6cc5d5b844-m7q6c" podStartSLOduration=8.041577537 podStartE2EDuration="8.041577537s" podCreationTimestamp="2026-02-18 06:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:35.022898357 +0000 UTC m=+1131.670857501" watchObservedRunningTime="2026-02-18 06:06:35.041577537 +0000 UTC m=+1131.689536671" Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.253537 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-986f6fbf8-z89c7" podStartSLOduration=7.253512236 podStartE2EDuration="7.253512236s" podCreationTimestamp="2026-02-18 06:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:35.053410425 +0000 UTC m=+1131.701369559" watchObservedRunningTime="2026-02-18 06:06:35.253512236 +0000 UTC m=+1131.901471370" Feb 18 06:06:35 crc kubenswrapper[4707]: I0218 06:06:35.260186 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-4kl7b" podStartSLOduration=4.199457973 podStartE2EDuration="48.260169704s" podCreationTimestamp="2026-02-18 06:05:47 +0000 UTC" 
firstStartedPulling="2026-02-18 06:05:49.501989914 +0000 UTC m=+1086.149949048" lastFinishedPulling="2026-02-18 06:06:33.562701655 +0000 UTC m=+1130.210660779" observedRunningTime="2026-02-18 06:06:35.248312546 +0000 UTC m=+1131.896271680" watchObservedRunningTime="2026-02-18 06:06:35.260169704 +0000 UTC m=+1131.908128838" Feb 18 06:06:36 crc kubenswrapper[4707]: I0218 06:06:36.061820 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74700413-a033-43e0-b01c-04c21e097135" path="/var/lib/kubelet/pods/74700413-a033-43e0-b01c-04c21e097135/volumes" Feb 18 06:06:38 crc kubenswrapper[4707]: I0218 06:06:37.055510 4707 scope.go:117] "RemoveContainer" containerID="40f9a7cfd85c3dd54411e77cee661183201be52245695b82633085d580003ce2" Feb 18 06:06:39 crc kubenswrapper[4707]: I0218 06:06:39.034658 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" podUID="8ed2f5cf-84b8-4a09-b76f-a60bcb055a04" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:06:39 crc kubenswrapper[4707]: I0218 06:06:39.035882 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-77db99878b-h8xzs" podUID="6aa9efa8-e6b5-4307-89b1-8a67547a35e9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 18 06:06:39 crc kubenswrapper[4707]: I0218 06:06:39.036024 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-584c97fdd8-f4pbz" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 18 06:06:39 crc kubenswrapper[4707]: I0218 
06:06:39.504321 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:40 crc kubenswrapper[4707]: I0218 06:06:40.142804 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767657c548-8lcwd_c861568d-c8d4-49f4-9e5d-524c9445b152/neutron-httpd/2.log" Feb 18 06:06:40 crc kubenswrapper[4707]: I0218 06:06:40.143978 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767657c548-8lcwd_c861568d-c8d4-49f4-9e5d-524c9445b152/neutron-httpd/1.log" Feb 18 06:06:40 crc kubenswrapper[4707]: I0218 06:06:40.144488 4707 generic.go:334] "Generic (PLEG): container finished" podID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerID="8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6" exitCode=1 Feb 18 06:06:40 crc kubenswrapper[4707]: I0218 06:06:40.144534 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-767657c548-8lcwd" event={"ID":"c861568d-c8d4-49f4-9e5d-524c9445b152","Type":"ContainerDied","Data":"8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6"} Feb 18 06:06:40 crc kubenswrapper[4707]: I0218 06:06:40.144580 4707 scope.go:117] "RemoveContainer" containerID="40f9a7cfd85c3dd54411e77cee661183201be52245695b82633085d580003ce2" Feb 18 06:06:40 crc kubenswrapper[4707]: I0218 06:06:40.146046 4707 scope.go:117] "RemoveContainer" containerID="8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6" Feb 18 06:06:40 crc kubenswrapper[4707]: E0218 06:06:40.146573 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-767657c548-8lcwd_openstack(c861568d-c8d4-49f4-9e5d-524c9445b152)\"" pod="openstack/neutron-767657c548-8lcwd" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" Feb 18 06:06:40 crc kubenswrapper[4707]: I0218 06:06:40.529779 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:06:41 crc kubenswrapper[4707]: I0218 06:06:41.159936 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767657c548-8lcwd_c861568d-c8d4-49f4-9e5d-524c9445b152/neutron-httpd/2.log" Feb 18 06:06:44 crc kubenswrapper[4707]: I0218 06:06:44.205115 4707 generic.go:334] "Generic (PLEG): container finished" podID="3f8ab203-7caa-4df6-9c0c-599a9d1b9612" containerID="36085a9410c8c53d98b5847c5b00a965f89d979f3a53a99d89fba2d10f3760e8" exitCode=0 Feb 18 06:06:44 crc kubenswrapper[4707]: I0218 06:06:44.205371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bpqnj" event={"ID":"3f8ab203-7caa-4df6-9c0c-599a9d1b9612","Type":"ContainerDied","Data":"36085a9410c8c53d98b5847c5b00a965f89d979f3a53a99d89fba2d10f3760e8"} Feb 18 06:06:45 crc kubenswrapper[4707]: I0218 06:06:45.819424 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bpqnj" Feb 18 06:06:45 crc kubenswrapper[4707]: I0218 06:06:45.908655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-db-sync-config-data\") pod \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " Feb 18 06:06:45 crc kubenswrapper[4707]: I0218 06:06:45.908958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-combined-ca-bundle\") pod \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " Feb 18 06:06:45 crc kubenswrapper[4707]: I0218 06:06:45.909069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz2kr\" (UniqueName: \"kubernetes.io/projected/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-kube-api-access-zz2kr\") pod \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\" (UID: \"3f8ab203-7caa-4df6-9c0c-599a9d1b9612\") " Feb 18 06:06:45 crc kubenswrapper[4707]: I0218 06:06:45.914773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3f8ab203-7caa-4df6-9c0c-599a9d1b9612" (UID: "3f8ab203-7caa-4df6-9c0c-599a9d1b9612"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:45 crc kubenswrapper[4707]: I0218 06:06:45.916232 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-kube-api-access-zz2kr" (OuterVolumeSpecName: "kube-api-access-zz2kr") pod "3f8ab203-7caa-4df6-9c0c-599a9d1b9612" (UID: "3f8ab203-7caa-4df6-9c0c-599a9d1b9612"). 
InnerVolumeSpecName "kube-api-access-zz2kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:45 crc kubenswrapper[4707]: I0218 06:06:45.951661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f8ab203-7caa-4df6-9c0c-599a9d1b9612" (UID: "3f8ab203-7caa-4df6-9c0c-599a9d1b9612"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:45 crc kubenswrapper[4707]: E0218 06:06:45.953540 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.010611 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.010659 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz2kr\" (UniqueName: \"kubernetes.io/projected/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-kube-api-access-zz2kr\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.010673 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f8ab203-7caa-4df6-9c0c-599a9d1b9612-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.231324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"835f51c5-3996-4120-bd2f-4b3bef33c31d","Type":"ContainerStarted","Data":"334205e1000ab08aadced1ad4efb96cad86cdc4b0b6b52ddf4f6aa24fe3d6163"} Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.231625 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.231710 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="proxy-httpd" containerID="cri-o://334205e1000ab08aadced1ad4efb96cad86cdc4b0b6b52ddf4f6aa24fe3d6163" gracePeriod=30 Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.231769 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="sg-core" containerID="cri-o://25585314e4b53e676ff01370bfa90e26dce3f9c36989ea869fae981da6c662c9" gracePeriod=30 Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.231706 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="ceilometer-notification-agent" containerID="cri-o://ba6be017b8075d81c56f6f49cf226d8481b69e5c48326636118cd0f0f5040bb4" gracePeriod=30 Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.241214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bpqnj" event={"ID":"3f8ab203-7caa-4df6-9c0c-599a9d1b9612","Type":"ContainerDied","Data":"156ed10aff4de870ab0172c570e6f3c7a7c4520bb38a1221165bfe2af07d1e43"} Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.241284 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="156ed10aff4de870ab0172c570e6f3c7a7c4520bb38a1221165bfe2af07d1e43" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.241382 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bpqnj" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.245772 4707 generic.go:334] "Generic (PLEG): container finished" podID="0301370e-0d52-4549-93ba-033d6d706508" containerID="41d8068787160a33336b10393dce5cd5dbb2fb1f382dfb8518fa0426ef1569e8" exitCode=0 Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.245870 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-w2p4x" event={"ID":"0301370e-0d52-4549-93ba-033d6d706508","Type":"ContainerDied","Data":"41d8068787160a33336b10393dce5cd5dbb2fb1f382dfb8518fa0426ef1569e8"} Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.258036 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5f975a7-ca3d-4940-9281-051360f67955" containerID="279f5535fb47a6f51575f5c80b1d90fab834c7b8ae4cc1cbc8af455923031057" exitCode=0 Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.258082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4kl7b" event={"ID":"e5f975a7-ca3d-4940-9281-051360f67955","Type":"ContainerDied","Data":"279f5535fb47a6f51575f5c80b1d90fab834c7b8ae4cc1cbc8af455923031057"} Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.595156 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hx5fl"] Feb 18 06:06:46 crc kubenswrapper[4707]: E0218 06:06:46.596068 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8ab203-7caa-4df6-9c0c-599a9d1b9612" containerName="barbican-db-sync" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.596089 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8ab203-7caa-4df6-9c0c-599a9d1b9612" containerName="barbican-db-sync" Feb 18 06:06:46 crc kubenswrapper[4707]: E0218 06:06:46.596134 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74700413-a033-43e0-b01c-04c21e097135" containerName="init" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 
06:06:46.596142 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="74700413-a033-43e0-b01c-04c21e097135" containerName="init" Feb 18 06:06:46 crc kubenswrapper[4707]: E0218 06:06:46.596155 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74700413-a033-43e0-b01c-04c21e097135" containerName="dnsmasq-dns" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.596164 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="74700413-a033-43e0-b01c-04c21e097135" containerName="dnsmasq-dns" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.596379 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="74700413-a033-43e0-b01c-04c21e097135" containerName="dnsmasq-dns" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.596404 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8ab203-7caa-4df6-9c0c-599a9d1b9612" containerName="barbican-db-sync" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.597708 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.632872 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-747894f44d-cbjjh"] Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.634272 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.639480 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.644596 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-sqgvm" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.644690 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.657762 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hx5fl"] Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.662141 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-667ddf5c59-6bpbq"] Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.663705 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.666324 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.679242 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-747894f44d-cbjjh"] Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.687986 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-667ddf5c59-6bpbq"] Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729027 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b16501ed-460b-4a5c-8d59-9acddd5e1011-config-data-custom\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-svc\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16501ed-460b-4a5c-8d59-9acddd5e1011-config-data\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7xps\" (UniqueName: \"kubernetes.io/projected/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-kube-api-access-q7xps\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b50daa-d6b3-4865-b224-516392956313-config-data\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16501ed-460b-4a5c-8d59-9acddd5e1011-combined-ca-bundle\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: 
I0218 06:06:46.729281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b16501ed-460b-4a5c-8d59-9acddd5e1011-logs\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06b50daa-d6b3-4865-b224-516392956313-config-data-custom\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b50daa-d6b3-4865-b224-516392956313-combined-ca-bundle\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp5wp\" (UniqueName: \"kubernetes.io/projected/06b50daa-d6b3-4865-b224-516392956313-kube-api-access-fp5wp\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: 
\"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b50daa-d6b3-4865-b224-516392956313-logs\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-config\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.729474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrxqq\" (UniqueName: \"kubernetes.io/projected/b16501ed-460b-4a5c-8d59-9acddd5e1011-kube-api-access-xrxqq\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06b50daa-d6b3-4865-b224-516392956313-config-data-custom\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b50daa-d6b3-4865-b224-516392956313-combined-ca-bundle\") 
pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp5wp\" (UniqueName: \"kubernetes.io/projected/06b50daa-d6b3-4865-b224-516392956313-kube-api-access-fp5wp\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b50daa-d6b3-4865-b224-516392956313-logs\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-config\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrxqq\" (UniqueName: \"kubernetes.io/projected/b16501ed-460b-4a5c-8d59-9acddd5e1011-kube-api-access-xrxqq\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b16501ed-460b-4a5c-8d59-9acddd5e1011-config-data-custom\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-svc\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16501ed-460b-4a5c-8d59-9acddd5e1011-config-data\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7xps\" (UniqueName: 
\"kubernetes.io/projected/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-kube-api-access-q7xps\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b50daa-d6b3-4865-b224-516392956313-config-data\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16501ed-460b-4a5c-8d59-9acddd5e1011-combined-ca-bundle\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b16501ed-460b-4a5c-8d59-9acddd5e1011-logs\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.832673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.833696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.836219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-svc\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.836904 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.837762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b16501ed-460b-4a5c-8d59-9acddd5e1011-logs\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.838101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b50daa-d6b3-4865-b224-516392956313-logs\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.842039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b16501ed-460b-4a5c-8d59-9acddd5e1011-config-data-custom\") pod 
\"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.842267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b50daa-d6b3-4865-b224-516392956313-combined-ca-bundle\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.842668 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-config\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.843561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06b50daa-d6b3-4865-b224-516392956313-config-data-custom\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.846285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b50daa-d6b3-4865-b224-516392956313-config-data\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.848294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: 
\"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.856448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16501ed-460b-4a5c-8d59-9acddd5e1011-config-data\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.863142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16501ed-460b-4a5c-8d59-9acddd5e1011-combined-ca-bundle\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.874293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrxqq\" (UniqueName: \"kubernetes.io/projected/b16501ed-460b-4a5c-8d59-9acddd5e1011-kube-api-access-xrxqq\") pod \"barbican-keystone-listener-747894f44d-cbjjh\" (UID: \"b16501ed-460b-4a5c-8d59-9acddd5e1011\") " pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.875494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp5wp\" (UniqueName: \"kubernetes.io/projected/06b50daa-d6b3-4865-b224-516392956313-kube-api-access-fp5wp\") pod \"barbican-worker-667ddf5c59-6bpbq\" (UID: \"06b50daa-d6b3-4865-b224-516392956313\") " pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.879789 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-654b57f754-5bkfb"] Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.879925 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q7xps\" (UniqueName: \"kubernetes.io/projected/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-kube-api-access-q7xps\") pod \"dnsmasq-dns-688c87cc99-hx5fl\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.882026 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.886157 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.897828 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-654b57f754-5bkfb"] Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.948210 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.966961 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" Feb 18 06:06:46 crc kubenswrapper[4707]: I0218 06:06:46.984031 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-667ddf5c59-6bpbq" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.039251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872558df-6201-4e66-9c41-d05a120eec8d-logs\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.039380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-combined-ca-bundle\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.039403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data-custom\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.039481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.039504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmp8w\" (UniqueName: \"kubernetes.io/projected/872558df-6201-4e66-9c41-d05a120eec8d-kube-api-access-hmp8w\") pod 
\"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.143971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-combined-ca-bundle\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.144563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data-custom\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.144701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.144736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmp8w\" (UniqueName: \"kubernetes.io/projected/872558df-6201-4e66-9c41-d05a120eec8d-kube-api-access-hmp8w\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.144812 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872558df-6201-4e66-9c41-d05a120eec8d-logs\") pod \"barbican-api-654b57f754-5bkfb\" (UID: 
\"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.146102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872558df-6201-4e66-9c41-d05a120eec8d-logs\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.150587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-combined-ca-bundle\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.151008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data-custom\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.151096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.167260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmp8w\" (UniqueName: \"kubernetes.io/projected/872558df-6201-4e66-9c41-d05a120eec8d-kube-api-access-hmp8w\") pod \"barbican-api-654b57f754-5bkfb\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " pod="openstack/barbican-api-654b57f754-5bkfb" 
Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.271730 4707 generic.go:334] "Generic (PLEG): container finished" podID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerID="334205e1000ab08aadced1ad4efb96cad86cdc4b0b6b52ddf4f6aa24fe3d6163" exitCode=0 Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.271777 4707 generic.go:334] "Generic (PLEG): container finished" podID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerID="25585314e4b53e676ff01370bfa90e26dce3f9c36989ea869fae981da6c662c9" exitCode=2 Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.271986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"835f51c5-3996-4120-bd2f-4b3bef33c31d","Type":"ContainerDied","Data":"334205e1000ab08aadced1ad4efb96cad86cdc4b0b6b52ddf4f6aa24fe3d6163"} Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.272060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"835f51c5-3996-4120-bd2f-4b3bef33c31d","Type":"ContainerDied","Data":"25585314e4b53e676ff01370bfa90e26dce3f9c36989ea869fae981da6c662c9"} Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.366442 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.443026 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-584c97fdd8-f4pbz" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.505974 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-747894f44d-cbjjh"] Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.806945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hx5fl"] Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.834617 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-667ddf5c59-6bpbq"] Feb 18 06:06:47 crc kubenswrapper[4707]: I0218 06:06:47.957210 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-654b57f754-5bkfb"] Feb 18 06:06:47 crc kubenswrapper[4707]: W0218 06:06:47.970386 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod872558df_6201_4e66_9c41_d05a120eec8d.slice/crio-21e96fb18f5bd6ec3e23be5651c54d38dd60d745973387647b48b71c83054e5b WatchSource:0}: Error finding container 21e96fb18f5bd6ec3e23be5651c54d38dd60d745973387647b48b71c83054e5b: Status 404 returned error can't find the container with id 21e96fb18f5bd6ec3e23be5651c54d38dd60d745973387647b48b71c83054e5b Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.039880 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.054819 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-w2p4x" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.175894 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-config-data\") pod \"0301370e-0d52-4549-93ba-033d6d706508\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.176068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-job-config-data\") pod \"0301370e-0d52-4549-93ba-033d6d706508\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.176111 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-db-sync-config-data\") pod \"e5f975a7-ca3d-4940-9281-051360f67955\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.176189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4267g\" (UniqueName: \"kubernetes.io/projected/e5f975a7-ca3d-4940-9281-051360f67955-kube-api-access-4267g\") pod \"e5f975a7-ca3d-4940-9281-051360f67955\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.176337 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-combined-ca-bundle\") pod \"0301370e-0d52-4549-93ba-033d6d706508\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.176390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-combined-ca-bundle\") pod \"e5f975a7-ca3d-4940-9281-051360f67955\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.176452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-config-data\") pod \"e5f975a7-ca3d-4940-9281-051360f67955\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.176483 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f975a7-ca3d-4940-9281-051360f67955-etc-machine-id\") pod \"e5f975a7-ca3d-4940-9281-051360f67955\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.176570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbrw4\" (UniqueName: \"kubernetes.io/projected/0301370e-0d52-4549-93ba-033d6d706508-kube-api-access-lbrw4\") pod \"0301370e-0d52-4549-93ba-033d6d706508\" (UID: \"0301370e-0d52-4549-93ba-033d6d706508\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.176661 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-scripts\") pod \"e5f975a7-ca3d-4940-9281-051360f67955\" (UID: \"e5f975a7-ca3d-4940-9281-051360f67955\") " Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.177674 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5f975a7-ca3d-4940-9281-051360f67955-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e5f975a7-ca3d-4940-9281-051360f67955" (UID: "e5f975a7-ca3d-4940-9281-051360f67955"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.185032 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f975a7-ca3d-4940-9281-051360f67955-kube-api-access-4267g" (OuterVolumeSpecName: "kube-api-access-4267g") pod "e5f975a7-ca3d-4940-9281-051360f67955" (UID: "e5f975a7-ca3d-4940-9281-051360f67955"). InnerVolumeSpecName "kube-api-access-4267g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.186126 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e5f975a7-ca3d-4940-9281-051360f67955" (UID: "e5f975a7-ca3d-4940-9281-051360f67955"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.186977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-scripts" (OuterVolumeSpecName: "scripts") pod "e5f975a7-ca3d-4940-9281-051360f67955" (UID: "e5f975a7-ca3d-4940-9281-051360f67955"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.189407 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-config-data" (OuterVolumeSpecName: "config-data") pod "0301370e-0d52-4549-93ba-033d6d706508" (UID: "0301370e-0d52-4549-93ba-033d6d706508"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.189997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "0301370e-0d52-4549-93ba-033d6d706508" (UID: "0301370e-0d52-4549-93ba-033d6d706508"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.198970 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0301370e-0d52-4549-93ba-033d6d706508-kube-api-access-lbrw4" (OuterVolumeSpecName: "kube-api-access-lbrw4") pod "0301370e-0d52-4549-93ba-033d6d706508" (UID: "0301370e-0d52-4549-93ba-033d6d706508"). InnerVolumeSpecName "kube-api-access-lbrw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.221629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0301370e-0d52-4549-93ba-033d6d706508" (UID: "0301370e-0d52-4549-93ba-033d6d706508"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.233888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f975a7-ca3d-4940-9281-051360f67955" (UID: "e5f975a7-ca3d-4940-9281-051360f67955"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.262623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-config-data" (OuterVolumeSpecName: "config-data") pod "e5f975a7-ca3d-4940-9281-051360f67955" (UID: "e5f975a7-ca3d-4940-9281-051360f67955"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278882 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278923 4707 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-job-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278936 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278947 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4267g\" (UniqueName: \"kubernetes.io/projected/e5f975a7-ca3d-4940-9281-051360f67955-kube-api-access-4267g\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278957 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0301370e-0d52-4549-93ba-033d6d706508-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278965 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278973 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278981 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f975a7-ca3d-4940-9281-051360f67955-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278989 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbrw4\" (UniqueName: \"kubernetes.io/projected/0301370e-0d52-4549-93ba-033d6d706508-kube-api-access-lbrw4\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.278997 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f975a7-ca3d-4940-9281-051360f67955-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.291353 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4kl7b" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.291858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4kl7b" event={"ID":"e5f975a7-ca3d-4940-9281-051360f67955","Type":"ContainerDied","Data":"ea3c5044cbd6eac9115a2e2869ae64f1f5a389c412af93229e95fe17dddbe77f"} Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.292060 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea3c5044cbd6eac9115a2e2869ae64f1f5a389c412af93229e95fe17dddbe77f" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.297720 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" containerID="10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d" exitCode=0 Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.297836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" event={"ID":"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4","Type":"ContainerDied","Data":"10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d"} Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.297876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" event={"ID":"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4","Type":"ContainerStarted","Data":"cc352063b09fcf652e24094ebb17ec2cad5e785fda46f792af5f88e7d4e7403f"} Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.301560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-654b57f754-5bkfb" event={"ID":"872558df-6201-4e66-9c41-d05a120eec8d","Type":"ContainerStarted","Data":"dcb0a29044cf4492d5e71084b21ad2afdd6d4ea21ff51a564c3b93947cae38ab"} Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.301634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-654b57f754-5bkfb" 
event={"ID":"872558df-6201-4e66-9c41-d05a120eec8d","Type":"ContainerStarted","Data":"21e96fb18f5bd6ec3e23be5651c54d38dd60d745973387647b48b71c83054e5b"} Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.307003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-667ddf5c59-6bpbq" event={"ID":"06b50daa-d6b3-4865-b224-516392956313","Type":"ContainerStarted","Data":"084c5211f9bae7eaa65061ab4b1d555dfcff8939df8c00b80b89bb46bcfe9c8b"} Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.310584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-w2p4x" event={"ID":"0301370e-0d52-4549-93ba-033d6d706508","Type":"ContainerDied","Data":"74847e87daa752b0d5e875a6a5a25511a8e43aef7fad69ac2f64536344e00109"} Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.310667 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74847e87daa752b0d5e875a6a5a25511a8e43aef7fad69ac2f64536344e00109" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.310826 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-w2p4x" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.313777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" event={"ID":"b16501ed-460b-4a5c-8d59-9acddd5e1011","Type":"ContainerStarted","Data":"49039d4252cc11e14a00107a79b87dfa0d87e67406d474c263c7089645ed7c15"} Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.682520 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Feb 18 06:06:48 crc kubenswrapper[4707]: E0218 06:06:48.683718 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f975a7-ca3d-4940-9281-051360f67955" containerName="cinder-db-sync" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.683732 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f975a7-ca3d-4940-9281-051360f67955" containerName="cinder-db-sync" Feb 18 06:06:48 crc kubenswrapper[4707]: E0218 06:06:48.683743 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0301370e-0d52-4549-93ba-033d6d706508" containerName="manila-db-sync" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.683750 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0301370e-0d52-4549-93ba-033d6d706508" containerName="manila-db-sync" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.683948 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0301370e-0d52-4549-93ba-033d6d706508" containerName="manila-db-sync" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.683972 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f975a7-ca3d-4940-9281-051360f67955" containerName="cinder-db-sync" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.684933 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.690259 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.690416 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-mtxr9" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.690515 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.695172 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.722567 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.787312 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pxvl\" (UniqueName: \"kubernetes.io/projected/4976b217-8ee1-4ef9-9ee8-93101252adcb-kube-api-access-9pxvl\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.787423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0" Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.787452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4976b217-8ee1-4ef9-9ee8-93101252adcb-etc-machine-id\") pod \"manila-scheduler-0\" (UID: 
\"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.787480 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-scripts\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.787536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.787582 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.787669 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.789336 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.797483 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.810872 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hx5fl"]
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.844577 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.858837 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.877781 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.883590 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.883767 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.883885 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nqck2"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.883988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4976b217-8ee1-4ef9-9ee8-93101252adcb-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-scripts\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-scripts\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-ceph\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcd94\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-kube-api-access-hcd94\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.894474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pxvl\" (UniqueName: \"kubernetes.io/projected/4976b217-8ee1-4ef9-9ee8-93101252adcb-kube-api-access-9pxvl\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.897431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4976b217-8ee1-4ef9-9ee8-93101252adcb-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.898387 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.913547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.932419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.933464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pxvl\" (UniqueName: \"kubernetes.io/projected/4976b217-8ee1-4ef9-9ee8-93101252adcb-kube-api-access-9pxvl\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.944972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-scripts\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.964112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " pod="openstack/manila-scheduler-0"
Feb 18 06:06:48 crc kubenswrapper[4707]: I0218 06:06:48.998947 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f5698b9dc-6jxrc"]
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.000697 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.002446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.002493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7w5d\" (UniqueName: \"kubernetes.io/projected/2760b874-f860-4b57-9cc3-91c3effda0cc-kube-api-access-z7w5d\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.002520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.002574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.006898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.006979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2760b874-f860-4b57-9cc3-91c3effda0cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.007038 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-scripts\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.007082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-ceph\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.007109 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.007177 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.007465 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f5698b9dc-6jxrc"]
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.011976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcd94\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-kube-api-access-hcd94\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.027223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.027316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.027440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.030057 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.036310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-scripts\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.050309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.051038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.071238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.080102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.080187 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.093684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.095632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcd94\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-kube-api-access-hcd94\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.110182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-ceph\") pod \"manila-share-share1-0\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.113044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.128710 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.130964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsg7c\" (UniqueName: \"kubernetes.io/projected/bea06ae2-b32c-42fe-9448-4eeb04a48947-kube-api-access-vsg7c\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131087 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-config\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7w5d\" (UniqueName: \"kubernetes.io/projected/2760b874-f860-4b57-9cc3-91c3effda0cc-kube-api-access-z7w5d\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-nb\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-svc\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-run\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.131765 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-sb\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.135191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.135251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2760b874-f860-4b57-9cc3-91c3effda0cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.135304 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.135332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.135379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.135450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.135515 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-swift-storage-0\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.135661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r44k2\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-kube-api-access-r44k2\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.141604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2760b874-f860-4b57-9cc3-91c3effda0cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.143675 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.146100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.146133 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.148856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.164094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.193691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7w5d\" (UniqueName: \"kubernetes.io/projected/2760b874-f860-4b57-9cc3-91c3effda0cc-kube-api-access-z7w5d\") pod \"cinder-scheduler-0\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " pod="openstack/cinder-scheduler-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.232104 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.273875 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.277960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-sb\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-swift-storage-0\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278213 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r44k2\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-kube-api-access-r44k2\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsg7c\" (UniqueName: \"kubernetes.io/projected/bea06ae2-b32c-42fe-9448-4eeb04a48947-kube-api-access-vsg7c\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0"
Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") 
" pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-config\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-nb\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278526 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-svc\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-run\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.278677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.280056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-sb\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.280230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.283147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.284172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: 
\"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.284952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.285015 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.285401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.285508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-run\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.285610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.288293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.288973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-svc\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.289031 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.294007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.294412 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.300489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " 
pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.300541 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.301813 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-swift-storage-0\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.305550 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-nb\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.308634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.311109 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-config\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.333896 4707 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.340236 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.355966 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.371070 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5698b9dc-6jxrc"] Feb 18 06:06:49 crc kubenswrapper[4707]: E0218 06:06:49.371971 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-vsg7c], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" podUID="bea06ae2-b32c-42fe-9448-4eeb04a48947" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.384493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r44k2\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-kube-api-access-r44k2\") pod \"cinder-volume-volume1-0\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.393062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-654b57f754-5bkfb" event={"ID":"872558df-6201-4e66-9c41-d05a120eec8d","Type":"ContainerStarted","Data":"457c33b156bd0dec7a5bfd7628485b6c0048c0424b4df582de25aa30aee61fe7"} Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.393448 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.393481 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:06:49 crc 
kubenswrapper[4707]: I0218 06:06:49.434869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsg7c\" (UniqueName: \"kubernetes.io/projected/bea06ae2-b32c-42fe-9448-4eeb04a48947-kube-api-access-vsg7c\") pod \"dnsmasq-dns-7f5698b9dc-6jxrc\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.436493 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9dcc9fdf5-4rnck"] Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.438110 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.441398 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.508906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.508979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-run\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " 
pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509036 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-dev\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-scripts\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-nvme\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509193 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-lib-modules\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnf8\" (UniqueName: \"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-kube-api-access-ttnf8\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-sys\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-ceph\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.509345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data-custom\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.530254 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.530957 4707 scope.go:117] "RemoveContainer" containerID="8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6" Feb 18 06:06:49 crc kubenswrapper[4707]: E0218 06:06:49.531136 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-767657c548-8lcwd_openstack(c861568d-c8d4-49f4-9e5d-524c9445b152)\"" pod="openstack/neutron-767657c548-8lcwd" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.531585 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.539267 4707 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.545037 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dcc9fdf5-4rnck"] Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.553935 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-767657c548-8lcwd" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-api" probeResult="failure" output="Get \"http://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.557885 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.559455 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.580945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.582737 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.614878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4l6f\" (UniqueName: \"kubernetes.io/projected/3621d2e8-f3b4-41d6-9386-e151527d23d3-kube-api-access-f4l6f\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.614941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-dev\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 
06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.614966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-config\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.614987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-nb\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-svc\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-scripts\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615089 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-logs\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615133 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-sb\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-nvme\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-lib-modules\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hknx\" (UniqueName: \"kubernetes.io/projected/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-kube-api-access-6hknx\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnf8\" (UniqueName: \"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-kube-api-access-ttnf8\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-sys\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " 
pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-ceph\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data-custom\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-etc-machine-id\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-swift-storage-0\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-run\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615678 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-scripts\") pod \"manila-api-0\" (UID: 
\"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data-custom\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.615858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-dev\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.623195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-sys\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.625125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-nvme\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.625237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-lib-modules\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.625293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-run\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.625504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.627991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.628069 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.631549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.638685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 
06:06:49.644187 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-654b57f754-5bkfb" podStartSLOduration=3.644165665 podStartE2EDuration="3.644165665s" podCreationTimestamp="2026-02-18 06:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:49.47135353 +0000 UTC m=+1146.119312664" watchObservedRunningTime="2026-02-18 06:06:49.644165665 +0000 UTC m=+1146.292124799" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.644408 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-scripts\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.644842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-ceph\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.647559 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data-custom\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.664785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.709592 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ttnf8\" (UniqueName: \"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-kube-api-access-ttnf8\") pod \"cinder-backup-0\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-etc-machine-id\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-swift-storage-0\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722590 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-scripts\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data-custom\") pod \"manila-api-0\" 
(UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4l6f\" (UniqueName: \"kubernetes.io/projected/3621d2e8-f3b4-41d6-9386-e151527d23d3-kube-api-access-f4l6f\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-config\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-nb\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-svc\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-logs\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 
06:06:49.722787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-sb\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.722861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hknx\" (UniqueName: \"kubernetes.io/projected/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-kube-api-access-6hknx\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.723199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-etc-machine-id\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.724008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-swift-storage-0\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.725488 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:49 crc kubenswrapper[4707]: 
I0218 06:06:49.725549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-logs\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.726141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-nb\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.726230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-svc\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.727213 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-sb\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.727327 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.727614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-config\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.734581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.744589 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.749289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-scripts\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.750974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.756956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4l6f\" (UniqueName: \"kubernetes.io/projected/3621d2e8-f3b4-41d6-9386-e151527d23d3-kube-api-access-f4l6f\") pod \"dnsmasq-dns-9dcc9fdf5-4rnck\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " 
pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.768042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hknx\" (UniqueName: \"kubernetes.io/projected/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-kube-api-access-6hknx\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.773131 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.774218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data-custom\") pod \"manila-api-0\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.802238 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.827027 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-scripts\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.827076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.827345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdkbh\" (UniqueName: \"kubernetes.io/projected/88ae39c5-0688-4fc3-8570-9069549bf56c-kube-api-access-vdkbh\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.827560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88ae39c5-0688-4fc3-8570-9069549bf56c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.827594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88ae39c5-0688-4fc3-8570-9069549bf56c-logs\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.827705 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.828067 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data-custom\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.901947 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.930697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-scripts\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.930765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.930870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdkbh\" (UniqueName: \"kubernetes.io/projected/88ae39c5-0688-4fc3-8570-9069549bf56c-kube-api-access-vdkbh\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.930958 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88ae39c5-0688-4fc3-8570-9069549bf56c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.930988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88ae39c5-0688-4fc3-8570-9069549bf56c-logs\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.931025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.931135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data-custom\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.931657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88ae39c5-0688-4fc3-8570-9069549bf56c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.932546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88ae39c5-0688-4fc3-8570-9069549bf56c-logs\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " 
pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.936180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data-custom\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.939024 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-scripts\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.939342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.940735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.941499 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 18 06:06:49 crc kubenswrapper[4707]: I0218 06:06:49.956649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdkbh\" (UniqueName: \"kubernetes.io/projected/88ae39c5-0688-4fc3-8570-9069549bf56c-kube-api-access-vdkbh\") pod \"cinder-api-0\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " pod="openstack/cinder-api-0" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.154181 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.403592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" event={"ID":"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4","Type":"ContainerStarted","Data":"8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4"} Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.403686 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.403878 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" podUID="b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" containerName="dnsmasq-dns" containerID="cri-o://8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4" gracePeriod=10 Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.404274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.405263 4707 scope.go:117] "RemoveContainer" containerID="8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6" Feb 18 06:06:50 crc kubenswrapper[4707]: E0218 06:06:50.405569 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-767657c548-8lcwd_openstack(c861568d-c8d4-49f4-9e5d-524c9445b152)\"" pod="openstack/neutron-767657c548-8lcwd" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.426544 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.544850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-sb\") pod \"bea06ae2-b32c-42fe-9448-4eeb04a48947\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.545247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bea06ae2-b32c-42fe-9448-4eeb04a48947" (UID: "bea06ae2-b32c-42fe-9448-4eeb04a48947"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.545400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-nb\") pod \"bea06ae2-b32c-42fe-9448-4eeb04a48947\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.545780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bea06ae2-b32c-42fe-9448-4eeb04a48947" (UID: "bea06ae2-b32c-42fe-9448-4eeb04a48947"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.545853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-svc\") pod \"bea06ae2-b32c-42fe-9448-4eeb04a48947\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.546191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bea06ae2-b32c-42fe-9448-4eeb04a48947" (UID: "bea06ae2-b32c-42fe-9448-4eeb04a48947"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.546251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsg7c\" (UniqueName: \"kubernetes.io/projected/bea06ae2-b32c-42fe-9448-4eeb04a48947-kube-api-access-vsg7c\") pod \"bea06ae2-b32c-42fe-9448-4eeb04a48947\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.546763 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bea06ae2-b32c-42fe-9448-4eeb04a48947" (UID: "bea06ae2-b32c-42fe-9448-4eeb04a48947"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.546891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-swift-storage-0\") pod \"bea06ae2-b32c-42fe-9448-4eeb04a48947\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.547008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-config\") pod \"bea06ae2-b32c-42fe-9448-4eeb04a48947\" (UID: \"bea06ae2-b32c-42fe-9448-4eeb04a48947\") " Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.547517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-config" (OuterVolumeSpecName: "config") pod "bea06ae2-b32c-42fe-9448-4eeb04a48947" (UID: "bea06ae2-b32c-42fe-9448-4eeb04a48947"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.547621 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.547636 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.547647 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.547656 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.552976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea06ae2-b32c-42fe-9448-4eeb04a48947-kube-api-access-vsg7c" (OuterVolumeSpecName: "kube-api-access-vsg7c") pod "bea06ae2-b32c-42fe-9448-4eeb04a48947" (UID: "bea06ae2-b32c-42fe-9448-4eeb04a48947"). InnerVolumeSpecName "kube-api-access-vsg7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.650506 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsg7c\" (UniqueName: \"kubernetes.io/projected/bea06ae2-b32c-42fe-9448-4eeb04a48947-kube-api-access-vsg7c\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.651056 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea06ae2-b32c-42fe-9448-4eeb04a48947-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:50 crc kubenswrapper[4707]: I0218 06:06:50.975741 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-77db99878b-h8xzs" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.031722 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" podStartSLOduration=5.031683528 podStartE2EDuration="5.031683528s" podCreationTimestamp="2026-02-18 06:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:50.431374198 +0000 UTC m=+1147.079333332" watchObservedRunningTime="2026-02-18 06:06:51.031683528 +0000 UTC m=+1147.679642672" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.128021 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.165978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-nb\") pod \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.166022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-svc\") pod \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.166054 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-config\") pod \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.166089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-sb\") pod \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.166151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7xps\" (UniqueName: \"kubernetes.io/projected/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-kube-api-access-q7xps\") pod \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.166209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-swift-storage-0\") pod \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\" (UID: \"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4\") " Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.178974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-kube-api-access-q7xps" (OuterVolumeSpecName: "kube-api-access-q7xps") pod "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" (UID: "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4"). InnerVolumeSpecName "kube-api-access-q7xps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.229358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" (UID: "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.270938 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.270993 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7xps\" (UniqueName: \"kubernetes.io/projected/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-kube-api-access-q7xps\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.274765 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" (UID: "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.306350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" (UID: "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.313559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-config" (OuterVolumeSpecName: "config") pod "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" (UID: "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.330424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" (UID: "b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.374889 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.374945 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.374958 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.374970 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.423087 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.424482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" event={"ID":"b16501ed-460b-4a5c-8d59-9acddd5e1011","Type":"ContainerStarted","Data":"794c9e6c35775760c4f55102e15fe38622b68abea029f8046609c94f1fa54e91"} Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.435390 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" containerID="8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4" exitCode=0 Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.435438 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.435461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" event={"ID":"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4","Type":"ContainerDied","Data":"8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4"} Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.436209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hx5fl" event={"ID":"b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4","Type":"ContainerDied","Data":"cc352063b09fcf652e24094ebb17ec2cad5e785fda46f792af5f88e7d4e7403f"} Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.436239 4707 scope.go:117] "RemoveContainer" containerID="8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.444771 4707 generic.go:334] "Generic (PLEG): container finished" podID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerID="ba6be017b8075d81c56f6f49cf226d8481b69e5c48326636118cd0f0f5040bb4" exitCode=0 Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.444928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"835f51c5-3996-4120-bd2f-4b3bef33c31d","Type":"ContainerDied","Data":"ba6be017b8075d81c56f6f49cf226d8481b69e5c48326636118cd0f0f5040bb4"} Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.447246 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f5698b9dc-6jxrc" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.449017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-667ddf5c59-6bpbq" event={"ID":"06b50daa-d6b3-4865-b224-516392956313","Type":"ContainerStarted","Data":"ffd474f937f28a64421a277815961f09538722688b61434b07c19b685e5c04dd"} Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.449057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-667ddf5c59-6bpbq" event={"ID":"06b50daa-d6b3-4865-b224-516392956313","Type":"ContainerStarted","Data":"315e44dd37a3379e035bb7343de6ea234eb53bbbf1bad43fc2b8e8e7a34ae2bb"} Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.474325 4707 scope.go:117] "RemoveContainer" containerID="10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.483746 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-667ddf5c59-6bpbq" podStartSLOduration=2.638297078 podStartE2EDuration="5.483719502s" podCreationTimestamp="2026-02-18 06:06:46 +0000 UTC" firstStartedPulling="2026-02-18 06:06:47.820960047 +0000 UTC m=+1144.468919191" lastFinishedPulling="2026-02-18 06:06:50.666382481 +0000 UTC m=+1147.314341615" observedRunningTime="2026-02-18 06:06:51.469455248 +0000 UTC m=+1148.117414382" watchObservedRunningTime="2026-02-18 06:06:51.483719502 +0000 UTC m=+1148.131678626" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.509016 4707 scope.go:117] "RemoveContainer" containerID="8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.509135 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hx5fl"] Feb 18 06:06:51 crc kubenswrapper[4707]: E0218 06:06:51.516882 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4\": container with ID starting with 8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4 not found: ID does not exist" containerID="8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.517079 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4"} err="failed to get container status \"8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4\": rpc error: code = NotFound desc = could not find container \"8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4\": container with ID starting with 8bc1cbe8f5095f0ee730c26623b517d4a31f363f32c42fe664387f5f8f950ce4 not found: ID does not exist" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.517107 4707 scope.go:117] "RemoveContainer" containerID="10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d" Feb 18 06:06:51 crc kubenswrapper[4707]: E0218 06:06:51.519347 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d\": container with ID starting with 10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d not found: ID does not exist" containerID="10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.519386 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d"} err="failed to get container status \"10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d\": rpc error: code = NotFound desc = could not find container 
\"10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d\": container with ID starting with 10c875be18a85fcc30b217c1f17d94e7455c75ab26afb1081977690fa9838e5d not found: ID does not exist" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.521456 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hx5fl"] Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.570664 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5698b9dc-6jxrc"] Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.585598 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.588323 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f5698b9dc-6jxrc"] Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.671978 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-767657c548-8lcwd"] Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.672346 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-767657c548-8lcwd" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-api" containerID="cri-o://70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a" gracePeriod=30 Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.824395 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:51 crc kubenswrapper[4707]: W0218 06:06:51.851953 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3621d2e8_f3b4_41d6_9386_e151527d23d3.slice/crio-09a47cbf51f4c266240e4a1a4a43b92404f7d1217648dc6c21e28a3241a4d599 WatchSource:0}: Error finding container 09a47cbf51f4c266240e4a1a4a43b92404f7d1217648dc6c21e28a3241a4d599: Status 404 returned error can't find the container 
with id 09a47cbf51f4c266240e4a1a4a43b92404f7d1217648dc6c21e28a3241a4d599 Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.907635 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.932740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dcc9fdf5-4rnck"] Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.949353 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.983853 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6998659dbc-vmh65"] Feb 18 06:06:51 crc kubenswrapper[4707]: E0218 06:06:51.984534 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" containerName="dnsmasq-dns" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.984555 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" containerName="dnsmasq-dns" Feb 18 06:06:51 crc kubenswrapper[4707]: E0218 06:06:51.984576 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" containerName="init" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.984585 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" containerName="init" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.984831 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" containerName="dnsmasq-dns" Feb 18 06:06:51 crc kubenswrapper[4707]: I0218 06:06:51.986049 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.043600 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6998659dbc-vmh65"] Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.051817 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.106248 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4" path="/var/lib/kubelet/pods/b0e9ae52-3d76-43eb-a5b6-c2c9b4acd2e4/volumes" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.106640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-public-tls-certs\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.106691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf4cq\" (UniqueName: \"kubernetes.io/projected/0cb3d300-55b7-4aea-b732-2ab9a36ace83-kube-api-access-bf4cq\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.106711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-httpd-config\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.106729 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-internal-tls-certs\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.106787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-ovndb-tls-certs\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.106857 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-combined-ca-bundle\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.106860 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea06ae2-b32c-42fe-9448-4eeb04a48947" path="/var/lib/kubelet/pods/bea06ae2-b32c-42fe-9448-4eeb04a48947/volumes" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.106945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-config\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.109679 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.155556 4707 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.212887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "835f51c5-3996-4120-bd2f-4b3bef33c31d" (UID: "835f51c5-3996-4120-bd2f-4b3bef33c31d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.209754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-log-httpd\") pod \"835f51c5-3996-4120-bd2f-4b3bef33c31d\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.213571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-config-data\") pod \"835f51c5-3996-4120-bd2f-4b3bef33c31d\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.213741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ggsw\" (UniqueName: \"kubernetes.io/projected/835f51c5-3996-4120-bd2f-4b3bef33c31d-kube-api-access-7ggsw\") pod \"835f51c5-3996-4120-bd2f-4b3bef33c31d\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.213787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-combined-ca-bundle\") pod \"835f51c5-3996-4120-bd2f-4b3bef33c31d\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.213895 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-run-httpd\") pod \"835f51c5-3996-4120-bd2f-4b3bef33c31d\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.214010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-sg-core-conf-yaml\") pod \"835f51c5-3996-4120-bd2f-4b3bef33c31d\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.214176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-scripts\") pod \"835f51c5-3996-4120-bd2f-4b3bef33c31d\" (UID: \"835f51c5-3996-4120-bd2f-4b3bef33c31d\") " Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.214785 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-ovndb-tls-certs\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.215178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-combined-ca-bundle\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.215274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-config\") pod \"neutron-6998659dbc-vmh65\" (UID: 
\"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.215399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-public-tls-certs\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.215512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf4cq\" (UniqueName: \"kubernetes.io/projected/0cb3d300-55b7-4aea-b732-2ab9a36ace83-kube-api-access-bf4cq\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.215583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-httpd-config\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.215629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-internal-tls-certs\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.215819 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.224055 4707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "835f51c5-3996-4120-bd2f-4b3bef33c31d" (UID: "835f51c5-3996-4120-bd2f-4b3bef33c31d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.228638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-public-tls-certs\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.235550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835f51c5-3996-4120-bd2f-4b3bef33c31d-kube-api-access-7ggsw" (OuterVolumeSpecName: "kube-api-access-7ggsw") pod "835f51c5-3996-4120-bd2f-4b3bef33c31d" (UID: "835f51c5-3996-4120-bd2f-4b3bef33c31d"). InnerVolumeSpecName "kube-api-access-7ggsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.235267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-combined-ca-bundle\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.249004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-scripts" (OuterVolumeSpecName: "scripts") pod "835f51c5-3996-4120-bd2f-4b3bef33c31d" (UID: "835f51c5-3996-4120-bd2f-4b3bef33c31d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.249649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf4cq\" (UniqueName: \"kubernetes.io/projected/0cb3d300-55b7-4aea-b732-2ab9a36ace83-kube-api-access-bf4cq\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.249875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-httpd-config\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.249943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-config\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.250424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-internal-tls-certs\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.251449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb3d300-55b7-4aea-b732-2ab9a36ace83-ovndb-tls-certs\") pod \"neutron-6998659dbc-vmh65\" (UID: \"0cb3d300-55b7-4aea-b732-2ab9a36ace83\") " pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.270430 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "835f51c5-3996-4120-bd2f-4b3bef33c31d" (UID: "835f51c5-3996-4120-bd2f-4b3bef33c31d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.319123 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ggsw\" (UniqueName: \"kubernetes.io/projected/835f51c5-3996-4120-bd2f-4b3bef33c31d-kube-api-access-7ggsw\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.319183 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835f51c5-3996-4120-bd2f-4b3bef33c31d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.319200 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.319213 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.326322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "835f51c5-3996-4120-bd2f-4b3bef33c31d" (UID: "835f51c5-3996-4120-bd2f-4b3bef33c31d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.341552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-config-data" (OuterVolumeSpecName: "config-data") pod "835f51c5-3996-4120-bd2f-4b3bef33c31d" (UID: "835f51c5-3996-4120-bd2f-4b3bef33c31d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.418255 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.421325 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.421373 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f51c5-3996-4120-bd2f-4b3bef33c31d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.481974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" event={"ID":"b16501ed-460b-4a5c-8d59-9acddd5e1011","Type":"ContainerStarted","Data":"d8351e21407b08f3155578c5b209243d4d23fa519c289520191d9a6faefca1d3"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.485871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e4c49d31-bd00-4baf-89c8-69de0c3a4a50","Type":"ContainerStarted","Data":"94c346a61fb945c1a3a3af43417b554b87944141664b354d2942fb82b60ebff4"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.488290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"88ae39c5-0688-4fc3-8570-9069549bf56c","Type":"ContainerStarted","Data":"132b8fdaa2cb047f4d0b4278eac0c92253d4b408c71bf671005f1cf28c1cecf3"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.490119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2760b874-f860-4b57-9cc3-91c3effda0cc","Type":"ContainerStarted","Data":"1c3d4f5c104fa2389111c75bf6eb56884d8d51ed169fdbee078a2b855a4dc741"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.493199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4976b217-8ee1-4ef9-9ee8-93101252adcb","Type":"ContainerStarted","Data":"4faa6cdec630374a56c6eb65602b120242f97212a5b7f7fa4589aab918d1fafa"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.510487 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-747894f44d-cbjjh" podStartSLOduration=3.405059752 podStartE2EDuration="6.510422377s" podCreationTimestamp="2026-02-18 06:06:46 +0000 UTC" firstStartedPulling="2026-02-18 06:06:47.517327804 +0000 UTC m=+1144.165286938" lastFinishedPulling="2026-02-18 06:06:50.622690429 +0000 UTC m=+1147.270649563" observedRunningTime="2026-02-18 06:06:52.504617212 +0000 UTC m=+1149.152576346" watchObservedRunningTime="2026-02-18 06:06:52.510422377 +0000 UTC m=+1149.158381521" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.515905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"835f51c5-3996-4120-bd2f-4b3bef33c31d","Type":"ContainerDied","Data":"2b9450f88020908d3d06a4f4c8f64efc597db4aa9b69a0fc71b21bcbc35f64c4"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.516068 4707 scope.go:117] "RemoveContainer" containerID="334205e1000ab08aadced1ad4efb96cad86cdc4b0b6b52ddf4f6aa24fe3d6163" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.520557 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.569041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6c6eadc8-c99a-4f6c-877a-d7974f8994bd","Type":"ContainerStarted","Data":"a532bd6dfbf820c2a97cdc25d6fdb5ad81a4b4f6cec00203f6a82c02ccdd2ff6"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.577903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"934db458-63c7-4b69-b4bf-70532ff71312","Type":"ContainerStarted","Data":"f1b6cb8c0a170a9490c293843fe57ecaccec4618fa9c160d8b2a1b5767e4a88a"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.596902 4707 generic.go:334] "Generic (PLEG): container finished" podID="3621d2e8-f3b4-41d6-9386-e151527d23d3" containerID="5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986" exitCode=0 Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.597001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" event={"ID":"3621d2e8-f3b4-41d6-9386-e151527d23d3","Type":"ContainerDied","Data":"5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.597039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" event={"ID":"3621d2e8-f3b4-41d6-9386-e151527d23d3","Type":"ContainerStarted","Data":"09a47cbf51f4c266240e4a1a4a43b92404f7d1217648dc6c21e28a3241a4d599"} Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.916495 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.941635 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.997438 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:52 
crc kubenswrapper[4707]: E0218 06:06:52.998016 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="proxy-httpd" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.998034 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="proxy-httpd" Feb 18 06:06:52 crc kubenswrapper[4707]: E0218 06:06:52.998060 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="ceilometer-notification-agent" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.998068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="ceilometer-notification-agent" Feb 18 06:06:52 crc kubenswrapper[4707]: E0218 06:06:52.998126 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="sg-core" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.998137 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="sg-core" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.998894 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="proxy-httpd" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.998936 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="ceilometer-notification-agent" Feb 18 06:06:52 crc kubenswrapper[4707]: I0218 06:06:52.998963 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" containerName="sg-core" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.001776 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.004056 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.006315 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.024689 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.045058 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.154297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.154727 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-config-data\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.154840 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-log-httpd\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.154901 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-run-httpd\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.154939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.154983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-scripts\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.155005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vjxl\" (UniqueName: \"kubernetes.io/projected/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-kube-api-access-6vjxl\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.256397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-log-httpd\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.256469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-run-httpd\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" 
Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.256507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.256545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-scripts\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.256568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vjxl\" (UniqueName: \"kubernetes.io/projected/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-kube-api-access-6vjxl\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.256644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.256680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-config-data\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.259912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-log-httpd\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.260727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-run-httpd\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.265447 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.265477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-scripts\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.267123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-config-data\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.267825 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.288019 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6vjxl\" (UniqueName: \"kubernetes.io/projected/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-kube-api-access-6vjxl\") pod \"ceilometer-0\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.294954 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6998659dbc-vmh65"] Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.336421 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-77db99878b-h8xzs" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.349544 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.407501 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.414049 4707 scope.go:117] "RemoveContainer" containerID="25585314e4b53e676ff01370bfa90e26dce3f9c36989ea869fae981da6c662c9" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.465614 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-584c97fdd8-f4pbz"] Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.466362 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-584c97fdd8-f4pbz" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon-log" containerID="cri-o://29a0f7ee99a161c9430446c37f0a44e132007f8c079b432ace3e5f0964a9f2cc" gracePeriod=30 Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.467005 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-584c97fdd8-f4pbz" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon" containerID="cri-o://22c76c3f5497afbf94a962ad1ce73ddde6f0928e5b98bac214feb82c9ced4db8" gracePeriod=30 Feb 18 06:06:53 crc 
kubenswrapper[4707]: I0218 06:06:53.568198 4707 scope.go:117] "RemoveContainer" containerID="ba6be017b8075d81c56f6f49cf226d8481b69e5c48326636118cd0f0f5040bb4" Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.662581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6c6eadc8-c99a-4f6c-877a-d7974f8994bd","Type":"ContainerStarted","Data":"89af1468265cedf916412573bfcd74fefdb7831b0111f3aa8b1c2caf11820e91"} Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.681992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6998659dbc-vmh65" event={"ID":"0cb3d300-55b7-4aea-b732-2ab9a36ace83","Type":"ContainerStarted","Data":"35e63e0315b5254352d580f1c32b868366792a82bdbd1a2f6fbb2bee1de20446"} Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.713169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e4c49d31-bd00-4baf-89c8-69de0c3a4a50","Type":"ContainerStarted","Data":"ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5"} Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.716641 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88ae39c5-0688-4fc3-8570-9069549bf56c","Type":"ContainerStarted","Data":"7f8ec52a681368c158f615277cae57f390e8c264dda58db38a2d9007f8d9a686"} Feb 18 06:06:53 crc kubenswrapper[4707]: I0218 06:06:53.724863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7e621198-27e7-4dce-aa75-3b16e6658b29","Type":"ContainerStarted","Data":"c427766e2c8556f225d8f4214c1254f3380bfe63cb1539afd94d2b093ad8cf33"} Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.075980 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835f51c5-3996-4120-bd2f-4b3bef33c31d" path="/var/lib/kubelet/pods/835f51c5-3996-4120-bd2f-4b3bef33c31d/volumes" Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.511361 4707 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:06:54 crc kubenswrapper[4707]: W0218 06:06:54.592916 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a24a9ae_2d01_4cd6_94ed_ac92a17a4a66.slice/crio-763716bedb4610f516d7e19c0b7335252280a19ef31c66a97ab6569d7334a819 WatchSource:0}: Error finding container 763716bedb4610f516d7e19c0b7335252280a19ef31c66a97ab6569d7334a819: Status 404 returned error can't find the container with id 763716bedb4610f516d7e19c0b7335252280a19ef31c66a97ab6569d7334a819 Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.758977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" event={"ID":"3621d2e8-f3b4-41d6-9386-e151527d23d3","Type":"ContainerStarted","Data":"767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b"} Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.760658 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.779523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerStarted","Data":"763716bedb4610f516d7e19c0b7335252280a19ef31c66a97ab6569d7334a819"} Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.784244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e4c49d31-bd00-4baf-89c8-69de0c3a4a50","Type":"ContainerStarted","Data":"482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f"} Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.796653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6998659dbc-vmh65" 
event={"ID":"0cb3d300-55b7-4aea-b732-2ab9a36ace83","Type":"ContainerStarted","Data":"45f627221576fcc40592cf5bf37ab070768568194e8fb11c742748cdab85e5a3"} Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.802174 4707 generic.go:334] "Generic (PLEG): container finished" podID="fe56b299-664f-478b-9f30-8e2a4c457676" containerID="22c76c3f5497afbf94a962ad1ce73ddde6f0928e5b98bac214feb82c9ced4db8" exitCode=0 Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.802253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-584c97fdd8-f4pbz" event={"ID":"fe56b299-664f-478b-9f30-8e2a4c457676","Type":"ContainerDied","Data":"22c76c3f5497afbf94a962ad1ce73ddde6f0928e5b98bac214feb82c9ced4db8"} Feb 18 06:06:54 crc kubenswrapper[4707]: I0218 06:06:54.823059 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" podStartSLOduration=5.823037551 podStartE2EDuration="5.823037551s" podCreationTimestamp="2026-02-18 06:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:54.780001397 +0000 UTC m=+1151.427960531" watchObservedRunningTime="2026-02-18 06:06:54.823037551 +0000 UTC m=+1151.470996685" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.715222 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767657c548-8lcwd_c861568d-c8d4-49f4-9e5d-524c9445b152/neutron-httpd/2.log" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.717659 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.744620 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=6.604084518 podStartE2EDuration="7.744597487s" podCreationTimestamp="2026-02-18 06:06:48 +0000 UTC" firstStartedPulling="2026-02-18 06:06:51.426970559 +0000 UTC m=+1148.074929703" lastFinishedPulling="2026-02-18 06:06:52.567483538 +0000 UTC m=+1149.215442672" observedRunningTime="2026-02-18 06:06:54.815990791 +0000 UTC m=+1151.463949925" watchObservedRunningTime="2026-02-18 06:06:55.744597487 +0000 UTC m=+1152.392556621" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.820387 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767657c548-8lcwd_c861568d-c8d4-49f4-9e5d-524c9445b152/neutron-httpd/2.log" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.820812 4707 generic.go:334] "Generic (PLEG): container finished" podID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerID="70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a" exitCode=0 Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.820877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-767657c548-8lcwd" event={"ID":"c861568d-c8d4-49f4-9e5d-524c9445b152","Type":"ContainerDied","Data":"70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a"} Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.820933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-767657c548-8lcwd" event={"ID":"c861568d-c8d4-49f4-9e5d-524c9445b152","Type":"ContainerDied","Data":"913160af363ffd3cf1aee395a56c62091c2624d4ebdd1bfbc06f99c60292676e"} Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.820951 4707 scope.go:117] "RemoveContainer" containerID="8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6" Feb 18 06:06:55 crc kubenswrapper[4707]: 
I0218 06:06:55.821055 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-767657c548-8lcwd" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.831656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6c6eadc8-c99a-4f6c-877a-d7974f8994bd","Type":"ContainerStarted","Data":"9e2c35573f42ce56afbb7cf93394b1f9ba4a1c1850840e3d52cc2536b2f40be7"} Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.831957 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.852993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6998659dbc-vmh65" event={"ID":"0cb3d300-55b7-4aea-b732-2ab9a36ace83","Type":"ContainerStarted","Data":"f67fd413a0c83d83507ad7a169cfb446ce0b37f116e7c57a2ea43e0edb05bbc5"} Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.853385 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.860887 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.860854365 podStartE2EDuration="6.860854365s" podCreationTimestamp="2026-02-18 06:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:55.853549159 +0000 UTC m=+1152.501508293" watchObservedRunningTime="2026-02-18 06:06:55.860854365 +0000 UTC m=+1152.508813499" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.866286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88ae39c5-0688-4fc3-8570-9069549bf56c","Type":"ContainerStarted","Data":"50a1fbeca613a91bf0cac63a48561a6138383f72b7bfb9b9f0bb15dfc2e4a689"} Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 
06:06:55.866524 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerName="cinder-api-log" containerID="cri-o://7f8ec52a681368c158f615277cae57f390e8c264dda58db38a2d9007f8d9a686" gracePeriod=30 Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.866871 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerName="cinder-api" containerID="cri-o://50a1fbeca613a91bf0cac63a48561a6138383f72b7bfb9b9f0bb15dfc2e4a689" gracePeriod=30 Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.866897 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.879770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7e621198-27e7-4dce-aa75-3b16e6658b29","Type":"ContainerStarted","Data":"3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79"} Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.884380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-config\") pod \"c861568d-c8d4-49f4-9e5d-524c9445b152\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.884439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-ovndb-tls-certs\") pod \"c861568d-c8d4-49f4-9e5d-524c9445b152\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.884528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-httpd-config\") pod \"c861568d-c8d4-49f4-9e5d-524c9445b152\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.884572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmm9k\" (UniqueName: \"kubernetes.io/projected/c861568d-c8d4-49f4-9e5d-524c9445b152-kube-api-access-zmm9k\") pod \"c861568d-c8d4-49f4-9e5d-524c9445b152\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.884635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-combined-ca-bundle\") pod \"c861568d-c8d4-49f4-9e5d-524c9445b152\" (UID: \"c861568d-c8d4-49f4-9e5d-524c9445b152\") " Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.896420 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6998659dbc-vmh65" podStartSLOduration=4.896385837 podStartE2EDuration="4.896385837s" podCreationTimestamp="2026-02-18 06:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:55.87822618 +0000 UTC m=+1152.526185314" watchObservedRunningTime="2026-02-18 06:06:55.896385837 +0000 UTC m=+1152.544344961" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.904843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c861568d-c8d4-49f4-9e5d-524c9445b152" (UID: "c861568d-c8d4-49f4-9e5d-524c9445b152"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.904890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c861568d-c8d4-49f4-9e5d-524c9445b152-kube-api-access-zmm9k" (OuterVolumeSpecName: "kube-api-access-zmm9k") pod "c861568d-c8d4-49f4-9e5d-524c9445b152" (UID: "c861568d-c8d4-49f4-9e5d-524c9445b152"). InnerVolumeSpecName "kube-api-access-zmm9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.940178 4707 scope.go:117] "RemoveContainer" containerID="70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.996948 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:55 crc kubenswrapper[4707]: I0218 06:06:55.996972 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmm9k\" (UniqueName: \"kubernetes.io/projected/c861568d-c8d4-49f4-9e5d-524c9445b152-kube-api-access-zmm9k\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.170559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-config" (OuterVolumeSpecName: "config") pod "c861568d-c8d4-49f4-9e5d-524c9445b152" (UID: "c861568d-c8d4-49f4-9e5d-524c9445b152"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.173225 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c861568d-c8d4-49f4-9e5d-524c9445b152" (UID: "c861568d-c8d4-49f4-9e5d-524c9445b152"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.201139 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.201445 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.202893 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.202876697 podStartE2EDuration="7.202876697s" podCreationTimestamp="2026-02-18 06:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:06:55.928522909 +0000 UTC m=+1152.576482043" watchObservedRunningTime="2026-02-18 06:06:56.202876697 +0000 UTC m=+1152.850835831" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.218411 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.223767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod 
"c861568d-c8d4-49f4-9e5d-524c9445b152" (UID: "c861568d-c8d4-49f4-9e5d-524c9445b152"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.279958 4707 scope.go:117] "RemoveContainer" containerID="8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6" Feb 18 06:06:56 crc kubenswrapper[4707]: E0218 06:06:56.287083 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6\": container with ID starting with 8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6 not found: ID does not exist" containerID="8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.287135 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6"} err="failed to get container status \"8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6\": rpc error: code = NotFound desc = could not find container \"8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6\": container with ID starting with 8d8a33b50d106118d6159c0728b17e1773072d686a97a7f4a982b985022ac0a6 not found: ID does not exist" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.287161 4707 scope.go:117] "RemoveContainer" containerID="70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a" Feb 18 06:06:56 crc kubenswrapper[4707]: E0218 06:06:56.287862 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a\": container with ID starting with 70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a not found: ID does not exist" 
containerID="70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.287886 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a"} err="failed to get container status \"70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a\": rpc error: code = NotFound desc = could not find container \"70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a\": container with ID starting with 70cdee7903a29e07ee576308c0965dc8e3f3863bc54a4af7117c42c132dcd84a not found: ID does not exist" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.305171 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c861568d-c8d4-49f4-9e5d-524c9445b152-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.482531 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-767657c548-8lcwd"] Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.496193 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-767657c548-8lcwd"] Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.921870 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2760b874-f860-4b57-9cc3-91c3effda0cc","Type":"ContainerStarted","Data":"de6b869af8e0dadae190e774446785585f7a0672557cc8c031aeccc15d8a31e6"} Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.938005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerStarted","Data":"1459c6da7524ff6db4d16ac1bdd2cc5e23efac69add9ab6646657dc14ba952d3"} Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.942646 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerID="50a1fbeca613a91bf0cac63a48561a6138383f72b7bfb9b9f0bb15dfc2e4a689" exitCode=0 Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.942667 4707 generic.go:334] "Generic (PLEG): container finished" podID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerID="7f8ec52a681368c158f615277cae57f390e8c264dda58db38a2d9007f8d9a686" exitCode=143 Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.942702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88ae39c5-0688-4fc3-8570-9069549bf56c","Type":"ContainerDied","Data":"50a1fbeca613a91bf0cac63a48561a6138383f72b7bfb9b9f0bb15dfc2e4a689"} Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.942720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88ae39c5-0688-4fc3-8570-9069549bf56c","Type":"ContainerDied","Data":"7f8ec52a681368c158f615277cae57f390e8c264dda58db38a2d9007f8d9a686"} Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.970071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7e621198-27e7-4dce-aa75-3b16e6658b29","Type":"ContainerStarted","Data":"61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c"} Feb 18 06:06:56 crc kubenswrapper[4707]: I0218 06:06:56.977931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4976b217-8ee1-4ef9-9ee8-93101252adcb","Type":"ContainerStarted","Data":"ca3a49015971daede027e9d4cd15fc49ae714693732dfae9fc58591cbe490a75"} Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.013641 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=7.885931817 podStartE2EDuration="9.013618751s" podCreationTimestamp="2026-02-18 06:06:48 +0000 UTC" firstStartedPulling="2026-02-18 06:06:53.086079906 +0000 UTC m=+1149.734039030" lastFinishedPulling="2026-02-18 
06:06:54.21376683 +0000 UTC m=+1150.861725964" observedRunningTime="2026-02-18 06:06:57.005694488 +0000 UTC m=+1153.653653612" watchObservedRunningTime="2026-02-18 06:06:57.013618751 +0000 UTC m=+1153.661577885" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.212967 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.362989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data-custom\") pod \"88ae39c5-0688-4fc3-8570-9069549bf56c\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.363252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data\") pod \"88ae39c5-0688-4fc3-8570-9069549bf56c\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.363468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88ae39c5-0688-4fc3-8570-9069549bf56c-etc-machine-id\") pod \"88ae39c5-0688-4fc3-8570-9069549bf56c\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.363610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-combined-ca-bundle\") pod \"88ae39c5-0688-4fc3-8570-9069549bf56c\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.363878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdkbh\" (UniqueName: 
\"kubernetes.io/projected/88ae39c5-0688-4fc3-8570-9069549bf56c-kube-api-access-vdkbh\") pod \"88ae39c5-0688-4fc3-8570-9069549bf56c\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.363954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-scripts\") pod \"88ae39c5-0688-4fc3-8570-9069549bf56c\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.363985 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88ae39c5-0688-4fc3-8570-9069549bf56c-logs\") pod \"88ae39c5-0688-4fc3-8570-9069549bf56c\" (UID: \"88ae39c5-0688-4fc3-8570-9069549bf56c\") " Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.364830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88ae39c5-0688-4fc3-8570-9069549bf56c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "88ae39c5-0688-4fc3-8570-9069549bf56c" (UID: "88ae39c5-0688-4fc3-8570-9069549bf56c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.373614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ae39c5-0688-4fc3-8570-9069549bf56c-logs" (OuterVolumeSpecName: "logs") pod "88ae39c5-0688-4fc3-8570-9069549bf56c" (UID: "88ae39c5-0688-4fc3-8570-9069549bf56c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.376384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "88ae39c5-0688-4fc3-8570-9069549bf56c" (UID: "88ae39c5-0688-4fc3-8570-9069549bf56c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.379011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ae39c5-0688-4fc3-8570-9069549bf56c-kube-api-access-vdkbh" (OuterVolumeSpecName: "kube-api-access-vdkbh") pod "88ae39c5-0688-4fc3-8570-9069549bf56c" (UID: "88ae39c5-0688-4fc3-8570-9069549bf56c"). InnerVolumeSpecName "kube-api-access-vdkbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.380210 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88ae39c5-0688-4fc3-8570-9069549bf56c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.380304 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdkbh\" (UniqueName: \"kubernetes.io/projected/88ae39c5-0688-4fc3-8570-9069549bf56c-kube-api-access-vdkbh\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.380367 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88ae39c5-0688-4fc3-8570-9069549bf56c-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.380435 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.384013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-scripts" (OuterVolumeSpecName: "scripts") pod "88ae39c5-0688-4fc3-8570-9069549bf56c" (UID: "88ae39c5-0688-4fc3-8570-9069549bf56c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.410188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88ae39c5-0688-4fc3-8570-9069549bf56c" (UID: "88ae39c5-0688-4fc3-8570-9069549bf56c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.470310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data" (OuterVolumeSpecName: "config-data") pod "88ae39c5-0688-4fc3-8570-9069549bf56c" (UID: "88ae39c5-0688-4fc3-8570-9069549bf56c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.482918 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.483152 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.483222 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ae39c5-0688-4fc3-8570-9069549bf56c-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.702225 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-fc8b85554-bcs7j"] Feb 18 06:06:57 crc kubenswrapper[4707]: E0218 06:06:57.702678 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerName="cinder-api-log" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.702694 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerName="cinder-api-log" Feb 18 06:06:57 crc kubenswrapper[4707]: E0218 06:06:57.702706 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-httpd" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.702713 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-httpd" Feb 18 06:06:57 crc kubenswrapper[4707]: E0218 06:06:57.702732 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-httpd" Feb 18 
06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.702738 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-httpd" Feb 18 06:06:57 crc kubenswrapper[4707]: E0218 06:06:57.702761 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerName="cinder-api" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.702767 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerName="cinder-api" Feb 18 06:06:57 crc kubenswrapper[4707]: E0218 06:06:57.702778 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-api" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.702785 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-api" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.703005 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-httpd" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.703020 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-api" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.703030 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-httpd" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.703037 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerName="cinder-api" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.703046 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ae39c5-0688-4fc3-8570-9069549bf56c" containerName="cinder-api-log" Feb 18 06:06:57 crc 
kubenswrapper[4707]: I0218 06:06:57.703055 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-httpd" Feb 18 06:06:57 crc kubenswrapper[4707]: E0218 06:06:57.703227 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-httpd" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.703240 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" containerName="neutron-httpd" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.704103 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.715137 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-fc8b85554-bcs7j"] Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.715613 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.715784 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.788857 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-internal-tls-certs\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.788951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-config-data-custom\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: 
\"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.788993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13bd62ec-d5ea-4ad3-8020-0cc244072675-logs\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.789030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-config-data\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.789046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-public-tls-certs\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.789085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-combined-ca-bundle\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.789105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5ngk\" (UniqueName: \"kubernetes.io/projected/13bd62ec-d5ea-4ad3-8020-0cc244072675-kube-api-access-g5ngk\") pod 
\"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.890970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5ngk\" (UniqueName: \"kubernetes.io/projected/13bd62ec-d5ea-4ad3-8020-0cc244072675-kube-api-access-g5ngk\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.891159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-internal-tls-certs\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.891286 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-config-data-custom\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.891351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13bd62ec-d5ea-4ad3-8020-0cc244072675-logs\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.891414 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-config-data\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: 
\"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.891444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-public-tls-certs\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.891511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-combined-ca-bundle\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.891913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13bd62ec-d5ea-4ad3-8020-0cc244072675-logs\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.901164 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-combined-ca-bundle\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.901433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-config-data\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 
06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.905339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-internal-tls-certs\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.913401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-config-data-custom\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.917497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5ngk\" (UniqueName: \"kubernetes.io/projected/13bd62ec-d5ea-4ad3-8020-0cc244072675-kube-api-access-g5ngk\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:57 crc kubenswrapper[4707]: I0218 06:06:57.920328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13bd62ec-d5ea-4ad3-8020-0cc244072675-public-tls-certs\") pod \"barbican-api-fc8b85554-bcs7j\" (UID: \"13bd62ec-d5ea-4ad3-8020-0cc244072675\") " pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.026133 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerName="manila-api-log" containerID="cri-o://89af1468265cedf916412573bfcd74fefdb7831b0111f3aa8b1c2caf11820e91" gracePeriod=30 Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.026687 4707 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.030003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88ae39c5-0688-4fc3-8570-9069549bf56c","Type":"ContainerDied","Data":"132b8fdaa2cb047f4d0b4278eac0c92253d4b408c71bf671005f1cf28c1cecf3"} Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.040703 4707 scope.go:117] "RemoveContainer" containerID="50a1fbeca613a91bf0cac63a48561a6138383f72b7bfb9b9f0bb15dfc2e4a689" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.033927 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerName="manila-api" containerID="cri-o://9e2c35573f42ce56afbb7cf93394b1f9ba4a1c1850840e3d52cc2536b2f40be7" gracePeriod=30 Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.044746 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.298858 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c861568d-c8d4-49f4-9e5d-524c9445b152" path="/var/lib/kubelet/pods/c861568d-c8d4-49f4-9e5d-524c9445b152/volumes" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.321857 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.333520 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.361186 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.362698 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.365496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.366245 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.369924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.409875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.512287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f03e391-db4f-46dd-b206-94e9f6d65e68-logs\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.512347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f03e391-db4f-46dd-b206-94e9f6d65e68-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.512380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj8qw\" (UniqueName: \"kubernetes.io/projected/6f03e391-db4f-46dd-b206-94e9f6d65e68-kube-api-access-kj8qw\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.512589 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.512994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-scripts\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.513052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.513194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.513231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-config-data\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.513248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.614948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.615038 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-scripts\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.615064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.615120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.615145 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-config-data\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.615163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.615260 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f03e391-db4f-46dd-b206-94e9f6d65e68-logs\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.615293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f03e391-db4f-46dd-b206-94e9f6d65e68-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.615323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj8qw\" (UniqueName: \"kubernetes.io/projected/6f03e391-db4f-46dd-b206-94e9f6d65e68-kube-api-access-kj8qw\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.616573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f03e391-db4f-46dd-b206-94e9f6d65e68-logs\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.616641 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f03e391-db4f-46dd-b206-94e9f6d65e68-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc 
kubenswrapper[4707]: I0218 06:06:58.623714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.627633 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.632473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-config-data\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.650231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.650341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.650732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f03e391-db4f-46dd-b206-94e9f6d65e68-scripts\") pod \"cinder-api-0\" (UID: 
\"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.660081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj8qw\" (UniqueName: \"kubernetes.io/projected/6f03e391-db4f-46dd-b206-94e9f6d65e68-kube-api-access-kj8qw\") pod \"cinder-api-0\" (UID: \"6f03e391-db4f-46dd-b206-94e9f6d65e68\") " pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.709034 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 06:06:58 crc kubenswrapper[4707]: I0218 06:06:58.822074 4707 scope.go:117] "RemoveContainer" containerID="7f8ec52a681368c158f615277cae57f390e8c264dda58db38a2d9007f8d9a686" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.089174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4976b217-8ee1-4ef9-9ee8-93101252adcb","Type":"ContainerStarted","Data":"655965d1a64e084eae3f847542294bf2a7d235639aa4c827610cd6f459b714d6"} Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.125843 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=9.409038016 podStartE2EDuration="11.125785599s" podCreationTimestamp="2026-02-18 06:06:48 +0000 UTC" firstStartedPulling="2026-02-18 06:06:51.951506577 +0000 UTC m=+1148.599465701" lastFinishedPulling="2026-02-18 06:06:53.66825415 +0000 UTC m=+1150.316213284" observedRunningTime="2026-02-18 06:06:59.121442002 +0000 UTC m=+1155.769401136" watchObservedRunningTime="2026-02-18 06:06:59.125785599 +0000 UTC m=+1155.773744733" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.135031 4707 generic.go:334] "Generic (PLEG): container finished" podID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerID="9e2c35573f42ce56afbb7cf93394b1f9ba4a1c1850840e3d52cc2536b2f40be7" exitCode=0 Feb 18 06:06:59 crc 
kubenswrapper[4707]: I0218 06:06:59.135073 4707 generic.go:334] "Generic (PLEG): container finished" podID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerID="89af1468265cedf916412573bfcd74fefdb7831b0111f3aa8b1c2caf11820e91" exitCode=143 Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.135150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6c6eadc8-c99a-4f6c-877a-d7974f8994bd","Type":"ContainerDied","Data":"9e2c35573f42ce56afbb7cf93394b1f9ba4a1c1850840e3d52cc2536b2f40be7"} Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.135183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6c6eadc8-c99a-4f6c-877a-d7974f8994bd","Type":"ContainerDied","Data":"89af1468265cedf916412573bfcd74fefdb7831b0111f3aa8b1c2caf11820e91"} Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.193536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2760b874-f860-4b57-9cc3-91c3effda0cc","Type":"ContainerStarted","Data":"e3f69d3a09f1bbc10c2f9d583280700e75e893c70d77b390dece188f4fa0e5ea"} Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.224468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerStarted","Data":"5e7bfb230e7c8682920b6771fed5d956fda830cca53a74632e3f724b8c71cc00"} Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.243732 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=9.569936402 podStartE2EDuration="11.243710862s" podCreationTimestamp="2026-02-18 06:06:48 +0000 UTC" firstStartedPulling="2026-02-18 06:06:51.894509788 +0000 UTC m=+1148.542468922" lastFinishedPulling="2026-02-18 06:06:53.568284248 +0000 UTC m=+1150.216243382" observedRunningTime="2026-02-18 06:06:59.223011147 +0000 UTC m=+1155.870970301" 
watchObservedRunningTime="2026-02-18 06:06:59.243710862 +0000 UTC m=+1155.891669996" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.427891 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.442119 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.544757 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.546257 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-combined-ca-bundle\") pod \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.546358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-scripts\") pod \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.546381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data-custom\") pod \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.546447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hknx\" (UniqueName: \"kubernetes.io/projected/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-kube-api-access-6hknx\") pod \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\" (UID: 
\"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.546527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-logs\") pod \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.546695 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data\") pod \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.546752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-etc-machine-id\") pod \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\" (UID: \"6c6eadc8-c99a-4f6c-877a-d7974f8994bd\") " Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.547290 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-logs" (OuterVolumeSpecName: "logs") pod "6c6eadc8-c99a-4f6c-877a-d7974f8994bd" (UID: "6c6eadc8-c99a-4f6c-877a-d7974f8994bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.548907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6c6eadc8-c99a-4f6c-877a-d7974f8994bd" (UID: "6c6eadc8-c99a-4f6c-877a-d7974f8994bd"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.553442 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.553469 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.568877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-scripts" (OuterVolumeSpecName: "scripts") pod "6c6eadc8-c99a-4f6c-877a-d7974f8994bd" (UID: "6c6eadc8-c99a-4f6c-877a-d7974f8994bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.577123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6c6eadc8-c99a-4f6c-877a-d7974f8994bd" (UID: "6c6eadc8-c99a-4f6c-877a-d7974f8994bd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.577160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-kube-api-access-6hknx" (OuterVolumeSpecName: "kube-api-access-6hknx") pod "6c6eadc8-c99a-4f6c-877a-d7974f8994bd" (UID: "6c6eadc8-c99a-4f6c-877a-d7974f8994bd"). InnerVolumeSpecName "kube-api-access-6hknx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.735466 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.738220 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.738833 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hknx\" (UniqueName: \"kubernetes.io/projected/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-kube-api-access-6hknx\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.741198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c6eadc8-c99a-4f6c-877a-d7974f8994bd" (UID: "6c6eadc8-c99a-4f6c-877a-d7974f8994bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.811129 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.818925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.827016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data" (OuterVolumeSpecName: "config-data") pod "6c6eadc8-c99a-4f6c-877a-d7974f8994bd" (UID: "6c6eadc8-c99a-4f6c-877a-d7974f8994bd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.856619 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.856991 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c6eadc8-c99a-4f6c-877a-d7974f8994bd-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.905812 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 18 06:06:59 crc kubenswrapper[4707]: I0218 06:06:59.918084 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-fc8b85554-bcs7j"] Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.039628 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9chfr"] Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.039883 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" podUID="7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" containerName="dnsmasq-dns" containerID="cri-o://b147222add7ebbbf08a4b2a17aeaed24640cd82bf7d862f1b394d19336761830" gracePeriod=10 Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.097362 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ae39c5-0688-4fc3-8570-9069549bf56c" path="/var/lib/kubelet/pods/88ae39c5-0688-4fc3-8570-9069549bf56c/volumes" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.261258 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.292125 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6c6eadc8-c99a-4f6c-877a-d7974f8994bd","Type":"ContainerDied","Data":"a532bd6dfbf820c2a97cdc25d6fdb5ad81a4b4f6cec00203f6a82c02ccdd2ff6"} Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.292189 4707 scope.go:117] "RemoveContainer" containerID="9e2c35573f42ce56afbb7cf93394b1f9ba4a1c1850840e3d52cc2536b2f40be7" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.292382 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.307068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f03e391-db4f-46dd-b206-94e9f6d65e68","Type":"ContainerStarted","Data":"a3ab7306f443fe39a40c0d51e2d5707d5f49f5d4be79596cc2ec662de6f59a85"} Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.367541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerStarted","Data":"97cc33b24b8809dc241cb83bfc5c96fc0540e9a53361fce3b3d210d015b462ce"} Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.372933 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.395235 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.431453 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" containerID="b147222add7ebbbf08a4b2a17aeaed24640cd82bf7d862f1b394d19336761830" exitCode=0 Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.431553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" 
event={"ID":"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe","Type":"ContainerDied","Data":"b147222add7ebbbf08a4b2a17aeaed24640cd82bf7d862f1b394d19336761830"} Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.435230 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Feb 18 06:07:00 crc kubenswrapper[4707]: E0218 06:07:00.435711 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerName="manila-api-log" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.435732 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerName="manila-api-log" Feb 18 06:07:00 crc kubenswrapper[4707]: E0218 06:07:00.435763 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerName="manila-api" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.435769 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerName="manila-api" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.435947 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerName="manila-api-log" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.435964 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" containerName="manila-api" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.436921 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.460266 4707 scope.go:117] "RemoveContainer" containerID="89af1468265cedf916412573bfcd74fefdb7831b0111f3aa8b1c2caf11820e91" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.462781 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.462976 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.463567 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.471353 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.490422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-public-tls-certs\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.490452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-internal-tls-certs\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.490475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " 
pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.490499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-config-data-custom\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.490526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd5cr\" (UniqueName: \"kubernetes.io/projected/79366b7f-24dd-4217-b2af-7350751ce6d3-kube-api-access-xd5cr\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.490563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-scripts\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.490636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79366b7f-24dd-4217-b2af-7350751ce6d3-etc-machine-id\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.490687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79366b7f-24dd-4217-b2af-7350751ce6d3-logs\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.490712 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-config-data\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.510982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fc8b85554-bcs7j" event={"ID":"13bd62ec-d5ea-4ad3-8020-0cc244072675","Type":"ContainerStarted","Data":"3e07233a45f9a81c26bd5c3811d47a5cd2e3ee211bed32fe935407328397bfc1"} Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.594563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79366b7f-24dd-4217-b2af-7350751ce6d3-logs\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.594649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-config-data\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.594735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-public-tls-certs\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.594758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-internal-tls-certs\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 
06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.594787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.594840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-config-data-custom\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.594869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd5cr\" (UniqueName: \"kubernetes.io/projected/79366b7f-24dd-4217-b2af-7350751ce6d3-kube-api-access-xd5cr\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.594948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-scripts\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.595355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79366b7f-24dd-4217-b2af-7350751ce6d3-etc-machine-id\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.597551 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/79366b7f-24dd-4217-b2af-7350751ce6d3-logs\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.602399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-config-data\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.603895 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79366b7f-24dd-4217-b2af-7350751ce6d3-etc-machine-id\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.604995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.605746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-public-tls-certs\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.606966 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.609623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-config-data-custom\") pod \"manila-api-0\" 
(UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.614547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-scripts\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.619872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79366b7f-24dd-4217-b2af-7350751ce6d3-internal-tls-certs\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 06:07:00.624185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd5cr\" (UniqueName: \"kubernetes.io/projected/79366b7f-24dd-4217-b2af-7350751ce6d3-kube-api-access-xd5cr\") pod \"manila-api-0\" (UID: \"79366b7f-24dd-4217-b2af-7350751ce6d3\") " pod="openstack/manila-api-0" Feb 18 06:07:00 crc kubenswrapper[4707]: E0218 06:07:00.761327 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c6eadc8_c99a_4f6c_877a_d7974f8994bd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e5decf6_6bf6_42ad_b4bd_5eb235dd1bfe.slice/crio-conmon-b147222add7ebbbf08a4b2a17aeaed24640cd82bf7d862f1b394d19336761830.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c6eadc8_c99a_4f6c_877a_d7974f8994bd.slice/crio-a532bd6dfbf820c2a97cdc25d6fdb5ad81a4b4f6cec00203f6a82c02ccdd2ff6\": RecentStats: unable to find data in memory cache]" Feb 18 06:07:00 crc kubenswrapper[4707]: I0218 
06:07:00.856859 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.190755 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.251504 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.389448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-sb\") pod \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.389587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-nb\") pod \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.389646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-svc\") pod \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.389680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2zzm\" (UniqueName: \"kubernetes.io/projected/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-kube-api-access-n2zzm\") pod \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.389761 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-swift-storage-0\") pod \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.389827 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-config\") pod \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\" (UID: \"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe\") " Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.398049 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.449160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-kube-api-access-n2zzm" (OuterVolumeSpecName: "kube-api-access-n2zzm") pod "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" (UID: "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe"). InnerVolumeSpecName "kube-api-access-n2zzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.494169 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2zzm\" (UniqueName: \"kubernetes.io/projected/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-kube-api-access-n2zzm\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.532363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" (UID: "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.597721 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.602909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" event={"ID":"7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe","Type":"ContainerDied","Data":"badd193811f65e9e27c883f2e29d286f816ac03e0324fdc593271c3934ddb7f5"} Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.602965 4707 scope.go:117] "RemoveContainer" containerID="b147222add7ebbbf08a4b2a17aeaed24640cd82bf7d862f1b394d19336761830" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.603074 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-9chfr" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.608116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" (UID: "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.620214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fc8b85554-bcs7j" event={"ID":"13bd62ec-d5ea-4ad3-8020-0cc244072675","Type":"ContainerStarted","Data":"f06dfa36f9ede81e2608c656e7b1b058f96ace045ff6a2fce84bdf877988d838"} Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.620547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fc8b85554-bcs7j" event={"ID":"13bd62ec-d5ea-4ad3-8020-0cc244072675","Type":"ContainerStarted","Data":"0e8281a0f037239f5a850d8403c623c6638928d97f26967bb14bf700984f86be"} Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.621986 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.622148 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.644461 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerName="cinder-volume" containerID="cri-o://ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5" gracePeriod=30 Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.644736 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerName="probe" containerID="cri-o://482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f" gracePeriod=30 Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.695990 4707 scope.go:117] "RemoveContainer" containerID="e37a9ab49c4ce221b61837a42416bb6a3ef73d7b13c3090449465be7a843e5c1" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.707207 4707 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.790322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" (UID: "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.809669 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.932478 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-fc8b85554-bcs7j" podStartSLOduration=4.932457553 podStartE2EDuration="4.932457553s" podCreationTimestamp="2026-02-18 06:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:01.67169978 +0000 UTC m=+1158.319658934" watchObservedRunningTime="2026-02-18 06:07:01.932457553 +0000 UTC m=+1158.580416687" Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.963258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 18 06:07:01 crc kubenswrapper[4707]: I0218 06:07:01.984839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-config" (OuterVolumeSpecName: "config") pod "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" (UID: "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.020112 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.036740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" (UID: "7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.092912 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c6eadc8-c99a-4f6c-877a-d7974f8994bd" path="/var/lib/kubelet/pods/6c6eadc8-c99a-4f6c-877a-d7974f8994bd/volumes" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.123695 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.225871 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9chfr"] Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.233080 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9chfr"] Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.376396 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6cc5d5b844-m7q6c" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.553978 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77db99878b-h8xzs" 
podUID="6aa9efa8-e6b5-4307-89b1-8a67547a35e9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.670114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f03e391-db4f-46dd-b206-94e9f6d65e68","Type":"ContainerStarted","Data":"b146545f0bb523fc5ad2dc8ef192fc4f5cfa99f0b2049f06b271a7670c3c72e0"} Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.682036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"79366b7f-24dd-4217-b2af-7350751ce6d3","Type":"ContainerStarted","Data":"70b2a91c670dea76c31e5997f18b10b4ce0eacfbb0fd168d7989ff9bc9604490"} Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.692412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerStarted","Data":"5d656020fd25af66ea614a6b33160e616890c9b3d5f87c69e8c6df3649ff2d5d"} Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.692920 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.696082 4707 generic.go:334] "Generic (PLEG): container finished" podID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerID="ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5" exitCode=0 Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.697002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e4c49d31-bd00-4baf-89c8-69de0c3a4a50","Type":"ContainerDied","Data":"ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5"} Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.724254 4707 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ceilometer-0" podStartSLOduration=3.485425606 podStartE2EDuration="10.724239449s" podCreationTimestamp="2026-02-18 06:06:52 +0000 UTC" firstStartedPulling="2026-02-18 06:06:54.598520159 +0000 UTC m=+1151.246479293" lastFinishedPulling="2026-02-18 06:07:01.837334002 +0000 UTC m=+1158.485293136" observedRunningTime="2026-02-18 06:07:02.721711501 +0000 UTC m=+1159.369670635" watchObservedRunningTime="2026-02-18 06:07:02.724239449 +0000 UTC m=+1159.372198583" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.731170 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.851031 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-986f6fbf8-z89c7" Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.946194 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f79f4d956-gdrpq"] Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.946449 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f79f4d956-gdrpq" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerName="placement-log" containerID="cri-o://e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55" gracePeriod=30 Feb 18 06:07:02 crc kubenswrapper[4707]: I0218 06:07:02.946856 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f79f4d956-gdrpq" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerName="placement-api" containerID="cri-o://6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3" gracePeriod=30 Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.501787 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.694753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-lib-modules\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.694823 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-sys\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.694848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data-custom\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.694898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-lib-cinder\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.694967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-combined-ca-bundle\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.694993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-brick\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-nvme\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695049 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-run\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r44k2\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-kube-api-access-r44k2\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-dev\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695386 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-iscsi\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-machine-id\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-scripts\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-cinder\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.695519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-ceph\") pod \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\" (UID: \"e4c49d31-bd00-4baf-89c8-69de0c3a4a50\") " Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.696724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-sys" (OuterVolumeSpecName: "sys") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.696767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.698245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.698336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.698996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.699080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-dev" (OuterVolumeSpecName: "dev") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.699104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.699126 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-run" (OuterVolumeSpecName: "run") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.699145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.699607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.704648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-kube-api-access-r44k2" (OuterVolumeSpecName: "kube-api-access-r44k2") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "kube-api-access-r44k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.711142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-scripts" (OuterVolumeSpecName: "scripts") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.719089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.719266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"79366b7f-24dd-4217-b2af-7350751ce6d3","Type":"ContainerStarted","Data":"1ec1560e851150a479e0f3fcfeec611109bba1e1e946722558b09de36e613a3b"} Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.728226 4707 generic.go:334] "Generic (PLEG): container finished" podID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerID="482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f" exitCode=0 Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.728300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e4c49d31-bd00-4baf-89c8-69de0c3a4a50","Type":"ContainerDied","Data":"482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f"} Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.728329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e4c49d31-bd00-4baf-89c8-69de0c3a4a50","Type":"ContainerDied","Data":"94c346a61fb945c1a3a3af43417b554b87944141664b354d2942fb82b60ebff4"} Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.728349 4707 scope.go:117] "RemoveContainer" containerID="482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.728479 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.736929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f03e391-db4f-46dd-b206-94e9f6d65e68","Type":"ContainerStarted","Data":"f30e7dafdbfa24faa1cbc103b327c61f4e7ef00abf0dcae3bab0a0a6a9df4272"} Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.737786 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.741158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-ceph" (OuterVolumeSpecName: "ceph") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.743028 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerID="e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55" exitCode=143 Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.743343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f79f4d956-gdrpq" event={"ID":"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f","Type":"ContainerDied","Data":"e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55"} Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.771842 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.7718126739999995 podStartE2EDuration="5.771812674s" podCreationTimestamp="2026-02-18 06:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:03.755678661 +0000 UTC m=+1160.403637795" 
watchObservedRunningTime="2026-02-18 06:07:03.771812674 +0000 UTC m=+1160.419771808" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797494 4707 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-ceph\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797530 4707 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797540 4707 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-sys\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797548 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797559 4707 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797567 4707 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797574 4707 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797581 4707 
reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797589 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r44k2\" (UniqueName: \"kubernetes.io/projected/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-kube-api-access-r44k2\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797598 4707 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-dev\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797606 4707 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797615 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797623 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.797633 4707 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.802664 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.902325 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.949002 4707 scope.go:117] "RemoveContainer" containerID="ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.969502 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 18 06:07:03 crc kubenswrapper[4707]: E0218 06:07:03.970106 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" containerName="dnsmasq-dns" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.970122 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" containerName="dnsmasq-dns" Feb 18 06:07:03 crc kubenswrapper[4707]: E0218 06:07:03.970169 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerName="probe" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.970175 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerName="probe" Feb 18 06:07:03 crc kubenswrapper[4707]: E0218 06:07:03.970184 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" containerName="init" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.970190 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" containerName="init" Feb 18 06:07:03 crc kubenswrapper[4707]: E0218 
06:07:03.970201 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerName="cinder-volume" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.970207 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerName="cinder-volume" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.970429 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerName="probe" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.970459 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" containerName="cinder-volume" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.970473 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" containerName="dnsmasq-dns" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.971332 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.981824 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.982163 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.987400 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-ltw5g" Feb 18 06:07:03 crc kubenswrapper[4707]: I0218 06:07:03.987959 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.048815 4707 scope.go:117] "RemoveContainer" containerID="482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f" Feb 18 06:07:04 crc kubenswrapper[4707]: E0218 06:07:04.049608 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f\": container with ID starting with 482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f not found: ID does not exist" containerID="482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.049648 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f"} err="failed to get container status \"482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f\": rpc error: code = NotFound desc = could not find container \"482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f\": container with ID starting with 482cb1452418680804ec3c6d1e7c40d70a2001e401986174b126211b371cc80f not found: ID does not exist" Feb 18 06:07:04 crc 
kubenswrapper[4707]: I0218 06:07:04.049676 4707 scope.go:117] "RemoveContainer" containerID="ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5" Feb 18 06:07:04 crc kubenswrapper[4707]: E0218 06:07:04.054996 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5\": container with ID starting with ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5 not found: ID does not exist" containerID="ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.055080 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5"} err="failed to get container status \"ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5\": rpc error: code = NotFound desc = could not find container \"ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5\": container with ID starting with ca41c821569fb8f6327a8cf5c40ddab2d32377f525403ac235585d2e4d0743a5 not found: ID does not exist" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.087726 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe" path="/var/lib/kubelet/pods/7e5decf6-6bf6-42ad-b4bd-5eb235dd1bfe/volumes" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.099895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data" (OuterVolumeSpecName: "config-data") pod "e4c49d31-bd00-4baf-89c8-69de0c3a4a50" (UID: "e4c49d31-bd00-4baf-89c8-69de0c3a4a50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.107038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8848\" (UniqueName: \"kubernetes.io/projected/73dac699-5199-47dc-b173-8df7813c1ad4-kube-api-access-q8848\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.107127 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/73dac699-5199-47dc-b173-8df7813c1ad4-openstack-config-secret\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.107230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/73dac699-5199-47dc-b173-8df7813c1ad4-openstack-config\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.107248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73dac699-5199-47dc-b173-8df7813c1ad4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.107347 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c49d31-bd00-4baf-89c8-69de0c3a4a50-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.209352 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q8848\" (UniqueName: \"kubernetes.io/projected/73dac699-5199-47dc-b173-8df7813c1ad4-kube-api-access-q8848\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.209399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/73dac699-5199-47dc-b173-8df7813c1ad4-openstack-config-secret\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.209521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/73dac699-5199-47dc-b173-8df7813c1ad4-openstack-config\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.209538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73dac699-5199-47dc-b173-8df7813c1ad4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.210690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/73dac699-5199-47dc-b173-8df7813c1ad4-openstack-config\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.213906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/73dac699-5199-47dc-b173-8df7813c1ad4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.213947 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/73dac699-5199-47dc-b173-8df7813c1ad4-openstack-config-secret\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.227104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8848\" (UniqueName: \"kubernetes.io/projected/73dac699-5199-47dc-b173-8df7813c1ad4-kube-api-access-q8848\") pod \"openstackclient\" (UID: \"73dac699-5199-47dc-b173-8df7813c1ad4\") " pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.321156 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.376876 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.387189 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.408419 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.419037 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.425146 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.455928 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: 
I0218 06:07:04.517171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbq27\" (UniqueName: \"kubernetes.io/projected/62efca2e-66ee-443d-910e-eb9c22f0536f-kube-api-access-sbq27\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/62efca2e-66ee-443d-910e-eb9c22f0536f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517560 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-run\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517682 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517736 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.517810 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbq27\" (UniqueName: \"kubernetes.io/projected/62efca2e-66ee-443d-910e-eb9c22f0536f-kube-api-access-sbq27\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/62efca2e-66ee-443d-910e-eb9c22f0536f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619819 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: 
\"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-run\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.619975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620074 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-run\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " 
pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.620749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/62efca2e-66ee-443d-910e-eb9c22f0536f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.626026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.626518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/62efca2e-66ee-443d-910e-eb9c22f0536f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.627693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.629826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.632235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/62efca2e-66ee-443d-910e-eb9c22f0536f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.641250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbq27\" (UniqueName: \"kubernetes.io/projected/62efca2e-66ee-443d-910e-eb9c22f0536f-kube-api-access-sbq27\") pod \"cinder-volume-volume1-0\" (UID: \"62efca2e-66ee-443d-910e-eb9c22f0536f\") " pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.734235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.759993 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.770465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"79366b7f-24dd-4217-b2af-7350751ce6d3","Type":"ContainerStarted","Data":"1493b5dd1103a610c5cd9ac3685a39a02b6230f879aa6f037994e4cea2027510"} Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.772530 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.782823 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.783053 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2760b874-f860-4b57-9cc3-91c3effda0cc" containerName="cinder-scheduler" containerID="cri-o://de6b869af8e0dadae190e774446785585f7a0672557cc8c031aeccc15d8a31e6" gracePeriod=30 Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.783094 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2760b874-f860-4b57-9cc3-91c3effda0cc" containerName="probe" containerID="cri-o://e3f69d3a09f1bbc10c2f9d583280700e75e893c70d77b390dece188f4fa0e5ea" gracePeriod=30 Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.827015 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.826975243 podStartE2EDuration="4.826975243s" podCreationTimestamp="2026-02-18 06:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:04.810355688 +0000 UTC m=+1161.458314822" watchObservedRunningTime="2026-02-18 06:07:04.826975243 +0000 UTC m=+1161.474934397" Feb 18 06:07:04 crc kubenswrapper[4707]: I0218 06:07:04.864008 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 18 06:07:05 crc kubenswrapper[4707]: I0218 06:07:05.193389 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 18 06:07:05 crc kubenswrapper[4707]: I0218 06:07:05.278654 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Feb 18 06:07:05 crc kubenswrapper[4707]: I0218 06:07:05.378165 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 18 06:07:05 crc kubenswrapper[4707]: W0218 06:07:05.382916 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62efca2e_66ee_443d_910e_eb9c22f0536f.slice/crio-451497ec053d8e36f2de903d1721c4f007f87eb8704e67f967322c8df0e3ccd1 WatchSource:0}: Error finding container 451497ec053d8e36f2de903d1721c4f007f87eb8704e67f967322c8df0e3ccd1: Status 404 returned error can't find the container with id 451497ec053d8e36f2de903d1721c4f007f87eb8704e67f967322c8df0e3ccd1 
Feb 18 06:07:05 crc kubenswrapper[4707]: I0218 06:07:05.802208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"73dac699-5199-47dc-b173-8df7813c1ad4","Type":"ContainerStarted","Data":"5bbf475c3f4d44491b4a077399aaebb9fc772b0d7b0334808b47c8aeaac8e8dc"} Feb 18 06:07:05 crc kubenswrapper[4707]: I0218 06:07:05.812866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"62efca2e-66ee-443d-910e-eb9c22f0536f","Type":"ContainerStarted","Data":"ed86feb0e9d7188a3b81a14462a81e2b216c56d65f9d30eac6c85fcd4854e558"} Feb 18 06:07:05 crc kubenswrapper[4707]: I0218 06:07:05.812916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"62efca2e-66ee-443d-910e-eb9c22f0536f","Type":"ContainerStarted","Data":"451497ec053d8e36f2de903d1721c4f007f87eb8704e67f967322c8df0e3ccd1"} Feb 18 06:07:05 crc kubenswrapper[4707]: I0218 06:07:05.813224 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerName="cinder-backup" containerID="cri-o://3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79" gracePeriod=30 Feb 18 06:07:05 crc kubenswrapper[4707]: I0218 06:07:05.813588 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerName="probe" containerID="cri-o://61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c" gracePeriod=30 Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.067220 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c49d31-bd00-4baf-89c8-69de0c3a4a50" path="/var/lib/kubelet/pods/e4c49d31-bd00-4baf-89c8-69de0c3a4a50/volumes" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.535817 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.675315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-logs\") pod \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.675391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-internal-tls-certs\") pod \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.675467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-scripts\") pod \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.675591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-config-data\") pod \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.675623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-public-tls-certs\") pod \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.675667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq5kc\" (UniqueName: 
\"kubernetes.io/projected/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-kube-api-access-sq5kc\") pod \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.675689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-combined-ca-bundle\") pod \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\" (UID: \"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f\") " Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.676557 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-logs" (OuterVolumeSpecName: "logs") pod "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" (UID: "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.683035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-scripts" (OuterVolumeSpecName: "scripts") pod "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" (UID: "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.712978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-kube-api-access-sq5kc" (OuterVolumeSpecName: "kube-api-access-sq5kc") pod "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" (UID: "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f"). InnerVolumeSpecName "kube-api-access-sq5kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.773110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-config-data" (OuterVolumeSpecName: "config-data") pod "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" (UID: "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.785452 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.785501 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq5kc\" (UniqueName: \"kubernetes.io/projected/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-kube-api-access-sq5kc\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.785514 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.785523 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.813451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" (UID: "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.834667 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerID="3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79" exitCode=0 Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.834718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7e621198-27e7-4dce-aa75-3b16e6658b29","Type":"ContainerDied","Data":"3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79"} Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.840554 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerID="6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3" exitCode=0 Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.840657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f79f4d956-gdrpq" event={"ID":"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f","Type":"ContainerDied","Data":"6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3"} Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.840691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f79f4d956-gdrpq" event={"ID":"b2e17b52-2282-4ccb-ba32-06a1d10e6c6f","Type":"ContainerDied","Data":"40546d80fe707054739315f51cd066215cfbccad2c56e30729c28d87b3f2d906"} Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.840850 4707 scope.go:117] "RemoveContainer" containerID="6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.841157 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f79f4d956-gdrpq" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.847554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"62efca2e-66ee-443d-910e-eb9c22f0536f","Type":"ContainerStarted","Data":"d65873a13d2f970da9cbe2cf3b2d62ba40b1822fff52e9e9b7972cc478c92b31"} Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.853630 4707 generic.go:334] "Generic (PLEG): container finished" podID="2760b874-f860-4b57-9cc3-91c3effda0cc" containerID="e3f69d3a09f1bbc10c2f9d583280700e75e893c70d77b390dece188f4fa0e5ea" exitCode=0 Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.853676 4707 generic.go:334] "Generic (PLEG): container finished" podID="2760b874-f860-4b57-9cc3-91c3effda0cc" containerID="de6b869af8e0dadae190e774446785585f7a0672557cc8c031aeccc15d8a31e6" exitCode=0 Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.853680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2760b874-f860-4b57-9cc3-91c3effda0cc","Type":"ContainerDied","Data":"e3f69d3a09f1bbc10c2f9d583280700e75e893c70d77b390dece188f4fa0e5ea"} Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.853717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2760b874-f860-4b57-9cc3-91c3effda0cc","Type":"ContainerDied","Data":"de6b869af8e0dadae190e774446785585f7a0672557cc8c031aeccc15d8a31e6"} Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.870716 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.870695535 podStartE2EDuration="2.870695535s" podCreationTimestamp="2026-02-18 06:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:06.869857483 +0000 UTC m=+1163.517816617" 
watchObservedRunningTime="2026-02-18 06:07:06.870695535 +0000 UTC m=+1163.518654669" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.887738 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.901351 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" (UID: "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.904950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" (UID: "b2e17b52-2282-4ccb-ba32-06a1d10e6c6f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.991344 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:06 crc kubenswrapper[4707]: I0218 06:07:06.991408 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.041369 4707 scope.go:117] "RemoveContainer" containerID="e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.072952 4707 scope.go:117] "RemoveContainer" containerID="6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3" Feb 18 06:07:07 crc kubenswrapper[4707]: E0218 06:07:07.075332 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3\": container with ID starting with 6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3 not found: ID does not exist" containerID="6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.075365 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3"} err="failed to get container status \"6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3\": rpc error: code = NotFound desc = could not find container \"6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3\": container with ID starting with 6a0405fb0c6c680af31d8ce7413c430589a45df300db5913a3982170e208cca3 not found: ID does 
not exist" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.075386 4707 scope.go:117] "RemoveContainer" containerID="e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55" Feb 18 06:07:07 crc kubenswrapper[4707]: E0218 06:07:07.076698 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55\": container with ID starting with e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55 not found: ID does not exist" containerID="e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.076723 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55"} err="failed to get container status \"e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55\": rpc error: code = NotFound desc = could not find container \"e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55\": container with ID starting with e83331e9ec2e1402918e0f76210713620d1c230b79bd807d147b59e3290aff55 not found: ID does not exist" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.193641 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f79f4d956-gdrpq"] Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.209466 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7f79f4d956-gdrpq"] Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.480289 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.597615 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.612227 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7w5d\" (UniqueName: \"kubernetes.io/projected/2760b874-f860-4b57-9cc3-91c3effda0cc-kube-api-access-z7w5d\") pod \"2760b874-f860-4b57-9cc3-91c3effda0cc\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.612276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-scripts\") pod \"2760b874-f860-4b57-9cc3-91c3effda0cc\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.612354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2760b874-f860-4b57-9cc3-91c3effda0cc-etc-machine-id\") pod \"2760b874-f860-4b57-9cc3-91c3effda0cc\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.612400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data-custom\") pod \"2760b874-f860-4b57-9cc3-91c3effda0cc\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.612492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-combined-ca-bundle\") pod \"2760b874-f860-4b57-9cc3-91c3effda0cc\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.612584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data\") pod \"2760b874-f860-4b57-9cc3-91c3effda0cc\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.625705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2760b874-f860-4b57-9cc3-91c3effda0cc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2760b874-f860-4b57-9cc3-91c3effda0cc" (UID: "2760b874-f860-4b57-9cc3-91c3effda0cc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.628021 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2760b874-f860-4b57-9cc3-91c3effda0cc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.644958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2760b874-f860-4b57-9cc3-91c3effda0cc-kube-api-access-z7w5d" (OuterVolumeSpecName: "kube-api-access-z7w5d") pod "2760b874-f860-4b57-9cc3-91c3effda0cc" (UID: "2760b874-f860-4b57-9cc3-91c3effda0cc"). InnerVolumeSpecName "kube-api-access-z7w5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.647118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-scripts" (OuterVolumeSpecName: "scripts") pod "2760b874-f860-4b57-9cc3-91c3effda0cc" (UID: "2760b874-f860-4b57-9cc3-91c3effda0cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.661244 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2760b874-f860-4b57-9cc3-91c3effda0cc" (UID: "2760b874-f860-4b57-9cc3-91c3effda0cc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.728322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data" (OuterVolumeSpecName: "config-data") pod "2760b874-f860-4b57-9cc3-91c3effda0cc" (UID: "2760b874-f860-4b57-9cc3-91c3effda0cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.728814 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-brick\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.728877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-iscsi\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.728908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-machine-id\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 
crc kubenswrapper[4707]: I0218 06:07:07.728929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-dev\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.728989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-lib-modules\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.728971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data-custom\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-dev" (OuterVolumeSpecName: "dev") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data\") pod \"2760b874-f860-4b57-9cc3-91c3effda0cc\" (UID: \"2760b874-f860-4b57-9cc3-91c3effda0cc\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: W0218 06:07:07.729122 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2760b874-f860-4b57-9cc3-91c3effda0cc/volumes/kubernetes.io~secret/config-data Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data" (OuterVolumeSpecName: "config-data") pod "2760b874-f860-4b57-9cc3-91c3effda0cc" (UID: "2760b874-f860-4b57-9cc3-91c3effda0cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729160 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-sys\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-combined-ca-bundle\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-cinder\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729261 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-sys" 
(OuterVolumeSpecName: "sys") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttnf8\" (UniqueName: \"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-kube-api-access-ttnf8\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-scripts\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-ceph\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729491 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-run\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729549 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-nvme\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.729580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-lib-cinder\") pod \"7e621198-27e7-4dce-aa75-3b16e6658b29\" (UID: \"7e621198-27e7-4dce-aa75-3b16e6658b29\") " Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730298 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7w5d\" (UniqueName: \"kubernetes.io/projected/2760b874-f860-4b57-9cc3-91c3effda0cc-kube-api-access-z7w5d\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730318 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730330 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730340 4707 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730349 4707 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc 
kubenswrapper[4707]: I0218 06:07:07.730357 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730366 4707 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-dev\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730375 4707 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730384 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730394 4707 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-sys\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730421 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-run" (OuterVolumeSpecName: "run") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.730660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.738936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-kube-api-access-ttnf8" (OuterVolumeSpecName: "kube-api-access-ttnf8") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "kube-api-access-ttnf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.738985 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.739008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-ceph" (OuterVolumeSpecName: "ceph") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.739336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-scripts" (OuterVolumeSpecName: "scripts") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.740936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2760b874-f860-4b57-9cc3-91c3effda0cc" (UID: "2760b874-f860-4b57-9cc3-91c3effda0cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.783937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833578 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760b874-f860-4b57-9cc3-91c3effda0cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833611 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833622 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833630 4707 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833640 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttnf8\" (UniqueName: \"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-kube-api-access-ttnf8\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833652 4707 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833663 4707 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833672 4707 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7e621198-27e7-4dce-aa75-3b16e6658b29-ceph\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833681 4707 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.833689 4707 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7e621198-27e7-4dce-aa75-3b16e6658b29-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.867539 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2760b874-f860-4b57-9cc3-91c3effda0cc","Type":"ContainerDied","Data":"1c3d4f5c104fa2389111c75bf6eb56884d8d51ed169fdbee078a2b855a4dc741"} Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.867606 4707 scope.go:117] "RemoveContainer" containerID="e3f69d3a09f1bbc10c2f9d583280700e75e893c70d77b390dece188f4fa0e5ea" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.867787 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.869203 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data" (OuterVolumeSpecName: "config-data") pod "7e621198-27e7-4dce-aa75-3b16e6658b29" (UID: "7e621198-27e7-4dce-aa75-3b16e6658b29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.885660 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerID="61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c" exitCode=0 Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.885773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7e621198-27e7-4dce-aa75-3b16e6658b29","Type":"ContainerDied","Data":"61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c"} Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.885845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7e621198-27e7-4dce-aa75-3b16e6658b29","Type":"ContainerDied","Data":"c427766e2c8556f225d8f4214c1254f3380bfe63cb1539afd94d2b093ad8cf33"} Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.885925 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.936351 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e621198-27e7-4dce-aa75-3b16e6658b29-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.936979 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:07:07 crc kubenswrapper[4707]: I0218 06:07:07.993297 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.050932 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.091421 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2760b874-f860-4b57-9cc3-91c3effda0cc" path="/var/lib/kubelet/pods/2760b874-f860-4b57-9cc3-91c3effda0cc/volumes" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.092277 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" path="/var/lib/kubelet/pods/b2e17b52-2282-4ccb-ba32-06a1d10e6c6f/volumes" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.095346 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:07:08 crc kubenswrapper[4707]: E0218 06:07:08.095757 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerName="placement-api" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.096085 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerName="placement-api" Feb 18 06:07:08 crc kubenswrapper[4707]: E0218 06:07:08.096165 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2760b874-f860-4b57-9cc3-91c3effda0cc" 
containerName="probe" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.096252 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2760b874-f860-4b57-9cc3-91c3effda0cc" containerName="probe" Feb 18 06:07:08 crc kubenswrapper[4707]: E0218 06:07:08.096330 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerName="placement-log" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.096397 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerName="placement-log" Feb 18 06:07:08 crc kubenswrapper[4707]: E0218 06:07:08.096471 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2760b874-f860-4b57-9cc3-91c3effda0cc" containerName="cinder-scheduler" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.096535 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2760b874-f860-4b57-9cc3-91c3effda0cc" containerName="cinder-scheduler" Feb 18 06:07:08 crc kubenswrapper[4707]: E0218 06:07:08.096611 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerName="probe" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.096684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerName="probe" Feb 18 06:07:08 crc kubenswrapper[4707]: E0218 06:07:08.096760 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerName="cinder-backup" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.097027 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerName="cinder-backup" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.097324 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2760b874-f860-4b57-9cc3-91c3effda0cc" containerName="probe" Feb 18 06:07:08 crc kubenswrapper[4707]: 
I0218 06:07:08.097392 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2760b874-f860-4b57-9cc3-91c3effda0cc" containerName="cinder-scheduler" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.097466 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerName="cinder-backup" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.097551 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e621198-27e7-4dce-aa75-3b16e6658b29" containerName="probe" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.097628 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerName="placement-log" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.097710 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e17b52-2282-4ccb-ba32-06a1d10e6c6f" containerName="placement-api" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.098897 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-backup-0"] Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.099250 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.100538 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.101049 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.104403 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.105296 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.105432 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.112648 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.252528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.252857 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-scripts\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.252885 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-lib-modules\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.252937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-run\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.252956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9tz5\" (UniqueName: \"kubernetes.io/projected/2e24e699-659c-4701-9459-133197b510d7-kube-api-access-b9tz5\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.252991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-config-data\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-dev\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e24e699-659c-4701-9459-133197b510d7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-etc-iscsi\") 
pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-config-data\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-scripts\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " 
pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a75b087d-214f-4fed-a30c-d0d4f5607a08-ceph\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ptxj\" (UniqueName: \"kubernetes.io/projected/a75b087d-214f-4fed-a30c-d0d4f5607a08-kube-api-access-8ptxj\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253394 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.253430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-sys\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355168 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/a75b087d-214f-4fed-a30c-d0d4f5607a08-ceph\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptxj\" (UniqueName: \"kubernetes.io/projected/a75b087d-214f-4fed-a30c-d0d4f5607a08-kube-api-access-8ptxj\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355313 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " 
pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-sys\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-lib-modules\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-scripts\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-run\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9tz5\" (UniqueName: 
\"kubernetes.io/projected/2e24e699-659c-4701-9459-133197b510d7-kube-api-access-b9tz5\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-config-data\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-dev\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e24e699-659c-4701-9459-133197b510d7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-config-data\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc 
kubenswrapper[4707]: I0218 06:07:08.355594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-scripts\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.355628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.356025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.356076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-dev\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.356121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2e24e699-659c-4701-9459-133197b510d7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.356133 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.356088 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.356177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-run\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.356046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-lib-modules\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.356401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 
06:07:08.356449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-sys\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.356698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.359129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.359940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a75b087d-214f-4fed-a30c-d0d4f5607a08-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.366672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.369821 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.371033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.371771 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-scripts\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.373060 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a75b087d-214f-4fed-a30c-d0d4f5607a08-ceph\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.379951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e24e699-659c-4701-9459-133197b510d7-config-data\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.385686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptxj\" (UniqueName: \"kubernetes.io/projected/a75b087d-214f-4fed-a30c-d0d4f5607a08-kube-api-access-8ptxj\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.389417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.392275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75b087d-214f-4fed-a30c-d0d4f5607a08-config-data\") pod \"cinder-backup-0\" (UID: \"a75b087d-214f-4fed-a30c-d0d4f5607a08\") " pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.399435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9tz5\" (UniqueName: \"kubernetes.io/projected/2e24e699-659c-4701-9459-133197b510d7-kube-api-access-b9tz5\") pod \"cinder-scheduler-0\" (UID: \"2e24e699-659c-4701-9459-133197b510d7\") " pod="openstack/cinder-scheduler-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.448302 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 18 06:07:08 crc kubenswrapper[4707]: I0218 06:07:08.456939 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 06:07:09 crc kubenswrapper[4707]: I0218 06:07:09.031066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Feb 18 06:07:09 crc kubenswrapper[4707]: I0218 06:07:09.761381 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.067105 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e621198-27e7-4dce-aa75-3b16e6658b29" path="/var/lib/kubelet/pods/7e621198-27e7-4dce-aa75-3b16e6658b29/volumes" Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.254661 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.280062 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.280568 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerName="glance-log" containerID="cri-o://fd117981ca664788b78f8d24fb317f8453f05eab09bfa2489d6ea42e977fa597" gracePeriod=30 Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.282462 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerName="glance-httpd" containerID="cri-o://8ba5d40251905a9d76e41bc1b6a66cd86b068bdd7465c2eec88fc3f7a3f5597b" gracePeriod=30 Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.659588 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-fc8b85554-bcs7j" Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.749944 4707 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/barbican-api-654b57f754-5bkfb"] Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.751774 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-654b57f754-5bkfb" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api-log" containerID="cri-o://dcb0a29044cf4492d5e71084b21ad2afdd6d4ea21ff51a564c3b93947cae38ab" gracePeriod=30 Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.751992 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-654b57f754-5bkfb" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api" containerID="cri-o://457c33b156bd0dec7a5bfd7628485b6c0048c0424b4df582de25aa30aee61fe7" gracePeriod=30 Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.987544 4707 generic.go:334] "Generic (PLEG): container finished" podID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerID="fd117981ca664788b78f8d24fb317f8453f05eab09bfa2489d6ea42e977fa597" exitCode=143 Feb 18 06:07:10 crc kubenswrapper[4707]: I0218 06:07:10.987666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9656e2a3-692a-44cb-9260-2b3ae4227e82","Type":"ContainerDied","Data":"fd117981ca664788b78f8d24fb317f8453f05eab09bfa2489d6ea42e977fa597"} Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 06:07:11.015656 4707 generic.go:334] "Generic (PLEG): container finished" podID="872558df-6201-4e66-9c41-d05a120eec8d" containerID="dcb0a29044cf4492d5e71084b21ad2afdd6d4ea21ff51a564c3b93947cae38ab" exitCode=143 Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 06:07:11.016886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-654b57f754-5bkfb" event={"ID":"872558df-6201-4e66-9c41-d05a120eec8d","Type":"ContainerDied","Data":"dcb0a29044cf4492d5e71084b21ad2afdd6d4ea21ff51a564c3b93947cae38ab"} Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 
06:07:11.106611 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 06:07:11.163214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 06:07:11.207971 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 06:07:11.311132 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 06:07:11.311431 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="ceilometer-central-agent" containerID="cri-o://1459c6da7524ff6db4d16ac1bdd2cc5e23efac69add9ab6646657dc14ba952d3" gracePeriod=30 Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 06:07:11.311566 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="proxy-httpd" containerID="cri-o://5d656020fd25af66ea614a6b33160e616890c9b3d5f87c69e8c6df3649ff2d5d" gracePeriod=30 Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 06:07:11.311615 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="sg-core" containerID="cri-o://97cc33b24b8809dc241cb83bfc5c96fc0540e9a53361fce3b3d210d015b462ce" gracePeriod=30 Feb 18 06:07:11 crc kubenswrapper[4707]: I0218 06:07:11.311646 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="ceilometer-notification-agent" containerID="cri-o://5e7bfb230e7c8682920b6771fed5d956fda830cca53a74632e3f724b8c71cc00" gracePeriod=30 Feb 18 
06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.029102 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerID="5d656020fd25af66ea614a6b33160e616890c9b3d5f87c69e8c6df3649ff2d5d" exitCode=0 Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.029492 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerID="97cc33b24b8809dc241cb83bfc5c96fc0540e9a53361fce3b3d210d015b462ce" exitCode=2 Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.029177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerDied","Data":"5d656020fd25af66ea614a6b33160e616890c9b3d5f87c69e8c6df3649ff2d5d"} Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.029571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerDied","Data":"97cc33b24b8809dc241cb83bfc5c96fc0540e9a53361fce3b3d210d015b462ce"} Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.029605 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerDied","Data":"1459c6da7524ff6db4d16ac1bdd2cc5e23efac69add9ab6646657dc14ba952d3"} Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.029502 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerID="1459c6da7524ff6db4d16ac1bdd2cc5e23efac69add9ab6646657dc14ba952d3" exitCode=0 Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.029979 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerName="manila-scheduler" containerID="cri-o://ca3a49015971daede027e9d4cd15fc49ae714693732dfae9fc58591cbe490a75" 
gracePeriod=30 Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.030009 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerName="probe" containerID="cri-o://655965d1a64e084eae3f847542294bf2a7d235639aa4c827610cd6f459b714d6" gracePeriod=30 Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.836572 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-878756b99-xx5vn"] Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.840465 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.843022 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.843217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.845957 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.859096 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-878756b99-xx5vn"] Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.967447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-combined-ca-bundle\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.967500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-public-tls-certs\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.967543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-internal-tls-certs\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.967656 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpcg7\" (UniqueName: \"kubernetes.io/projected/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-kube-api-access-tpcg7\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.967703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-run-httpd\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.967748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-log-httpd\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.967779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-config-data\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:12 crc kubenswrapper[4707]: I0218 06:07:12.967847 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-etc-swift\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.047216 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerID="5e7bfb230e7c8682920b6771fed5d956fda830cca53a74632e3f724b8c71cc00" exitCode=0 Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.047294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerDied","Data":"5e7bfb230e7c8682920b6771fed5d956fda830cca53a74632e3f724b8c71cc00"} Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.054328 4707 generic.go:334] "Generic (PLEG): container finished" podID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerID="655965d1a64e084eae3f847542294bf2a7d235639aa4c827610cd6f459b714d6" exitCode=0 Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.054402 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4976b217-8ee1-4ef9-9ee8-93101252adcb","Type":"ContainerDied","Data":"655965d1a64e084eae3f847542294bf2a7d235639aa4c827610cd6f459b714d6"} Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.069369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpcg7\" (UniqueName: 
\"kubernetes.io/projected/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-kube-api-access-tpcg7\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.069443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-run-httpd\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.069483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-log-httpd\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.069511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-config-data\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.069531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-etc-swift\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.069594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-combined-ca-bundle\") pod 
\"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.069620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-public-tls-certs\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.069655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-internal-tls-certs\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.070246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-run-httpd\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.070418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-log-httpd\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.076578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-public-tls-certs\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " 
pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.078581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-combined-ca-bundle\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.083671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-config-data\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.084011 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-internal-tls-certs\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.097370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-etc-swift\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.100867 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpcg7\" (UniqueName: \"kubernetes.io/projected/a6b4c749-b753-42b9-8bc7-fb25121f0ea8-kube-api-access-tpcg7\") pod \"swift-proxy-878756b99-xx5vn\" (UID: \"a6b4c749-b753-42b9-8bc7-fb25121f0ea8\") " pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 
06:07:13.170246 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.574152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.574628 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" containerName="glance-log" containerID="cri-o://774ed4ddbb9a9aa8cb8b1341283c9576f6ec331e3eba7306eea0e6e212f6aa06" gracePeriod=30 Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.574780 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" containerName="glance-httpd" containerID="cri-o://77e60f34e134a5e24802740f91dd468fd680904bcd33b2525e04274c39738250" gracePeriod=30 Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.921783 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-654b57f754-5bkfb" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:35952->10.217.0.166:9311: read: connection reset by peer" Feb 18 06:07:13 crc kubenswrapper[4707]: I0218 06:07:13.922517 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-654b57f754-5bkfb" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:35968->10.217.0.166:9311: read: connection reset by peer" Feb 18 06:07:14 crc kubenswrapper[4707]: I0218 06:07:14.085926 4707 generic.go:334] "Generic (PLEG): container finished" podID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" 
containerID="774ed4ddbb9a9aa8cb8b1341283c9576f6ec331e3eba7306eea0e6e212f6aa06" exitCode=143 Feb 18 06:07:14 crc kubenswrapper[4707]: I0218 06:07:14.090911 4707 generic.go:334] "Generic (PLEG): container finished" podID="872558df-6201-4e66-9c41-d05a120eec8d" containerID="457c33b156bd0dec7a5bfd7628485b6c0048c0424b4df582de25aa30aee61fe7" exitCode=0 Feb 18 06:07:14 crc kubenswrapper[4707]: I0218 06:07:14.100410 4707 generic.go:334] "Generic (PLEG): container finished" podID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerID="8ba5d40251905a9d76e41bc1b6a66cd86b068bdd7465c2eec88fc3f7a3f5597b" exitCode=0 Feb 18 06:07:14 crc kubenswrapper[4707]: I0218 06:07:14.103587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2","Type":"ContainerDied","Data":"774ed4ddbb9a9aa8cb8b1341283c9576f6ec331e3eba7306eea0e6e212f6aa06"} Feb 18 06:07:14 crc kubenswrapper[4707]: I0218 06:07:14.103627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-654b57f754-5bkfb" event={"ID":"872558df-6201-4e66-9c41-d05a120eec8d","Type":"ContainerDied","Data":"457c33b156bd0dec7a5bfd7628485b6c0048c0424b4df582de25aa30aee61fe7"} Feb 18 06:07:14 crc kubenswrapper[4707]: I0218 06:07:14.103642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9656e2a3-692a-44cb-9260-2b3ae4227e82","Type":"ContainerDied","Data":"8ba5d40251905a9d76e41bc1b6a66cd86b068bdd7465c2eec88fc3f7a3f5597b"} Feb 18 06:07:15 crc kubenswrapper[4707]: I0218 06:07:15.116436 4707 generic.go:334] "Generic (PLEG): container finished" podID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerID="ca3a49015971daede027e9d4cd15fc49ae714693732dfae9fc58591cbe490a75" exitCode=0 Feb 18 06:07:15 crc kubenswrapper[4707]: I0218 06:07:15.116602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"4976b217-8ee1-4ef9-9ee8-93101252adcb","Type":"ContainerDied","Data":"ca3a49015971daede027e9d4cd15fc49ae714693732dfae9fc58591cbe490a75"} Feb 18 06:07:15 crc kubenswrapper[4707]: I0218 06:07:15.289316 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 18 06:07:17 crc kubenswrapper[4707]: I0218 06:07:17.147976 4707 generic.go:334] "Generic (PLEG): container finished" podID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" containerID="77e60f34e134a5e24802740f91dd468fd680904bcd33b2525e04274c39738250" exitCode=0 Feb 18 06:07:17 crc kubenswrapper[4707]: I0218 06:07:17.148154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2","Type":"ContainerDied","Data":"77e60f34e134a5e24802740f91dd468fd680904bcd33b2525e04274c39738250"} Feb 18 06:07:17 crc kubenswrapper[4707]: I0218 06:07:17.368047 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-654b57f754-5bkfb" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Feb 18 06:07:17 crc kubenswrapper[4707]: I0218 06:07:17.368360 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-654b57f754-5bkfb" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Feb 18 06:07:18 crc kubenswrapper[4707]: I0218 06:07:18.977378 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ts677"] Feb 18 06:07:18 crc kubenswrapper[4707]: I0218 06:07:18.978877 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:18 crc kubenswrapper[4707]: I0218 06:07:18.994314 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ts677"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.086090 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-r5xft"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.087283 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.103323 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-r5xft"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.113429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279b0e7-ca70-4cfd-92a9-c90f10658f69-operator-scripts\") pod \"nova-api-db-create-ts677\" (UID: \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\") " pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.113502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r96gs\" (UniqueName: \"kubernetes.io/projected/b279b0e7-ca70-4cfd-92a9-c90f10658f69-kube-api-access-r96gs\") pod \"nova-api-db-create-ts677\" (UID: \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\") " pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.191201 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f78d-account-create-update-tw92l"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.192440 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.197181 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.209615 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f78d-account-create-update-tw92l"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.215653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjh27\" (UniqueName: \"kubernetes.io/projected/57e387c5-5644-44e5-9479-901efc3f88e8-kube-api-access-hjh27\") pod \"nova-cell0-db-create-r5xft\" (UID: \"57e387c5-5644-44e5-9479-901efc3f88e8\") " pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.215705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279b0e7-ca70-4cfd-92a9-c90f10658f69-operator-scripts\") pod \"nova-api-db-create-ts677\" (UID: \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\") " pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.215743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r96gs\" (UniqueName: \"kubernetes.io/projected/b279b0e7-ca70-4cfd-92a9-c90f10658f69-kube-api-access-r96gs\") pod \"nova-api-db-create-ts677\" (UID: \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\") " pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.216255 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57e387c5-5644-44e5-9479-901efc3f88e8-operator-scripts\") pod \"nova-cell0-db-create-r5xft\" (UID: \"57e387c5-5644-44e5-9479-901efc3f88e8\") " 
pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.216603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279b0e7-ca70-4cfd-92a9-c90f10658f69-operator-scripts\") pod \"nova-api-db-create-ts677\" (UID: \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\") " pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.238739 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r96gs\" (UniqueName: \"kubernetes.io/projected/b279b0e7-ca70-4cfd-92a9-c90f10658f69-kube-api-access-r96gs\") pod \"nova-api-db-create-ts677\" (UID: \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\") " pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.296853 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-f5brp"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.298602 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.299030 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.306940 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f5brp"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.321553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjh27\" (UniqueName: \"kubernetes.io/projected/57e387c5-5644-44e5-9479-901efc3f88e8-kube-api-access-hjh27\") pod \"nova-cell0-db-create-r5xft\" (UID: \"57e387c5-5644-44e5-9479-901efc3f88e8\") " pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.321648 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9ed69c-f65c-4a44-bf84-f9f911905bce-operator-scripts\") pod \"nova-api-f78d-account-create-update-tw92l\" (UID: \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\") " pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.321751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pftgx\" (UniqueName: \"kubernetes.io/projected/8b9ed69c-f65c-4a44-bf84-f9f911905bce-kube-api-access-pftgx\") pod \"nova-api-f78d-account-create-update-tw92l\" (UID: \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\") " pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.321848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57e387c5-5644-44e5-9479-901efc3f88e8-operator-scripts\") pod \"nova-cell0-db-create-r5xft\" (UID: \"57e387c5-5644-44e5-9479-901efc3f88e8\") " pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.322643 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57e387c5-5644-44e5-9479-901efc3f88e8-operator-scripts\") pod \"nova-cell0-db-create-r5xft\" (UID: \"57e387c5-5644-44e5-9479-901efc3f88e8\") " pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.338659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjh27\" (UniqueName: \"kubernetes.io/projected/57e387c5-5644-44e5-9479-901efc3f88e8-kube-api-access-hjh27\") pod \"nova-cell0-db-create-r5xft\" (UID: \"57e387c5-5644-44e5-9479-901efc3f88e8\") " pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.396012 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8582-account-create-update-d45wr"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.398081 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.400661 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.406314 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8582-account-create-update-d45wr"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.414208 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.423165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9ed69c-f65c-4a44-bf84-f9f911905bce-operator-scripts\") pod \"nova-api-f78d-account-create-update-tw92l\" (UID: \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\") " pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.423223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m7sl\" (UniqueName: \"kubernetes.io/projected/dd09444c-e4f0-4a3b-afeb-6841e197b017-kube-api-access-2m7sl\") pod \"nova-cell1-db-create-f5brp\" (UID: \"dd09444c-e4f0-4a3b-afeb-6841e197b017\") " pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.423273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pftgx\" (UniqueName: \"kubernetes.io/projected/8b9ed69c-f65c-4a44-bf84-f9f911905bce-kube-api-access-pftgx\") pod \"nova-api-f78d-account-create-update-tw92l\" (UID: \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\") " pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.423294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd09444c-e4f0-4a3b-afeb-6841e197b017-operator-scripts\") pod \"nova-cell1-db-create-f5brp\" (UID: \"dd09444c-e4f0-4a3b-afeb-6841e197b017\") " pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.424735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9ed69c-f65c-4a44-bf84-f9f911905bce-operator-scripts\") pod 
\"nova-api-f78d-account-create-update-tw92l\" (UID: \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\") " pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.444327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pftgx\" (UniqueName: \"kubernetes.io/projected/8b9ed69c-f65c-4a44-bf84-f9f911905bce-kube-api-access-pftgx\") pod \"nova-api-f78d-account-create-update-tw92l\" (UID: \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\") " pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.510976 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.525035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/868cb81a-07d6-4515-8129-32c1e5d06ca7-operator-scripts\") pod \"nova-cell0-8582-account-create-update-d45wr\" (UID: \"868cb81a-07d6-4515-8129-32c1e5d06ca7\") " pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.525127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m7sl\" (UniqueName: \"kubernetes.io/projected/dd09444c-e4f0-4a3b-afeb-6841e197b017-kube-api-access-2m7sl\") pod \"nova-cell1-db-create-f5brp\" (UID: \"dd09444c-e4f0-4a3b-afeb-6841e197b017\") " pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.525216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd09444c-e4f0-4a3b-afeb-6841e197b017-operator-scripts\") pod \"nova-cell1-db-create-f5brp\" (UID: \"dd09444c-e4f0-4a3b-afeb-6841e197b017\") " 
pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.525258 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vc4l\" (UniqueName: \"kubernetes.io/projected/868cb81a-07d6-4515-8129-32c1e5d06ca7-kube-api-access-7vc4l\") pod \"nova-cell0-8582-account-create-update-d45wr\" (UID: \"868cb81a-07d6-4515-8129-32c1e5d06ca7\") " pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.525820 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd09444c-e4f0-4a3b-afeb-6841e197b017-operator-scripts\") pod \"nova-cell1-db-create-f5brp\" (UID: \"dd09444c-e4f0-4a3b-afeb-6841e197b017\") " pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.546261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m7sl\" (UniqueName: \"kubernetes.io/projected/dd09444c-e4f0-4a3b-afeb-6841e197b017-kube-api-access-2m7sl\") pod \"nova-cell1-db-create-f5brp\" (UID: \"dd09444c-e4f0-4a3b-afeb-6841e197b017\") " pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.584934 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f40b-account-create-update-55rmw"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.586098 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.588685 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.596092 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f40b-account-create-update-55rmw"] Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.641535 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.642002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vc4l\" (UniqueName: \"kubernetes.io/projected/868cb81a-07d6-4515-8129-32c1e5d06ca7-kube-api-access-7vc4l\") pod \"nova-cell0-8582-account-create-update-d45wr\" (UID: \"868cb81a-07d6-4515-8129-32c1e5d06ca7\") " pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.642132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/868cb81a-07d6-4515-8129-32c1e5d06ca7-operator-scripts\") pod \"nova-cell0-8582-account-create-update-d45wr\" (UID: \"868cb81a-07d6-4515-8129-32c1e5d06ca7\") " pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.643076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/868cb81a-07d6-4515-8129-32c1e5d06ca7-operator-scripts\") pod \"nova-cell0-8582-account-create-update-d45wr\" (UID: \"868cb81a-07d6-4515-8129-32c1e5d06ca7\") " pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.660885 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7vc4l\" (UniqueName: \"kubernetes.io/projected/868cb81a-07d6-4515-8129-32c1e5d06ca7-kube-api-access-7vc4l\") pod \"nova-cell0-8582-account-create-update-d45wr\" (UID: \"868cb81a-07d6-4515-8129-32c1e5d06ca7\") " pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.724194 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.767538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c750576-ce13-4f43-a430-80a8d84b7829-operator-scripts\") pod \"nova-cell1-f40b-account-create-update-55rmw\" (UID: \"3c750576-ce13-4f43-a430-80a8d84b7829\") " pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.769091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdbj8\" (UniqueName: \"kubernetes.io/projected/3c750576-ce13-4f43-a430-80a8d84b7829-kube-api-access-sdbj8\") pod \"nova-cell1-f40b-account-create-update-55rmw\" (UID: \"3c750576-ce13-4f43-a430-80a8d84b7829\") " pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.871591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdbj8\" (UniqueName: \"kubernetes.io/projected/3c750576-ce13-4f43-a430-80a8d84b7829-kube-api-access-sdbj8\") pod \"nova-cell1-f40b-account-create-update-55rmw\" (UID: \"3c750576-ce13-4f43-a430-80a8d84b7829\") " pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.871701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c750576-ce13-4f43-a430-80a8d84b7829-operator-scripts\") pod \"nova-cell1-f40b-account-create-update-55rmw\" (UID: \"3c750576-ce13-4f43-a430-80a8d84b7829\") " pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.873249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c750576-ce13-4f43-a430-80a8d84b7829-operator-scripts\") pod \"nova-cell1-f40b-account-create-update-55rmw\" (UID: \"3c750576-ce13-4f43-a430-80a8d84b7829\") " pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:19 crc kubenswrapper[4707]: I0218 06:07:19.890187 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdbj8\" (UniqueName: \"kubernetes.io/projected/3c750576-ce13-4f43-a430-80a8d84b7829-kube-api-access-sdbj8\") pod \"nova-cell1-f40b-account-create-update-55rmw\" (UID: \"3c750576-ce13-4f43-a430-80a8d84b7829\") " pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:20 crc kubenswrapper[4707]: I0218 06:07:20.011876 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:20 crc kubenswrapper[4707]: I0218 06:07:20.934540 4707 scope.go:117] "RemoveContainer" containerID="de6b869af8e0dadae190e774446785585f7a0672557cc8c031aeccc15d8a31e6" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.343241 4707 scope.go:117] "RemoveContainer" containerID="61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.597774 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.651897 4707 scope.go:117] "RemoveContainer" containerID="3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.666897 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.708442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-scripts\") pod \"4976b217-8ee1-4ef9-9ee8-93101252adcb\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.708502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4976b217-8ee1-4ef9-9ee8-93101252adcb-etc-machine-id\") pod \"4976b217-8ee1-4ef9-9ee8-93101252adcb\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.708634 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data-custom\") pod \"4976b217-8ee1-4ef9-9ee8-93101252adcb\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.708660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pxvl\" (UniqueName: \"kubernetes.io/projected/4976b217-8ee1-4ef9-9ee8-93101252adcb-kube-api-access-9pxvl\") pod \"4976b217-8ee1-4ef9-9ee8-93101252adcb\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.708749 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-combined-ca-bundle\") pod \"4976b217-8ee1-4ef9-9ee8-93101252adcb\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.708775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data\") pod \"4976b217-8ee1-4ef9-9ee8-93101252adcb\" (UID: \"4976b217-8ee1-4ef9-9ee8-93101252adcb\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.712472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4976b217-8ee1-4ef9-9ee8-93101252adcb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4976b217-8ee1-4ef9-9ee8-93101252adcb" (UID: "4976b217-8ee1-4ef9-9ee8-93101252adcb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.724186 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4976b217-8ee1-4ef9-9ee8-93101252adcb-kube-api-access-9pxvl" (OuterVolumeSpecName: "kube-api-access-9pxvl") pod "4976b217-8ee1-4ef9-9ee8-93101252adcb" (UID: "4976b217-8ee1-4ef9-9ee8-93101252adcb"). InnerVolumeSpecName "kube-api-access-9pxvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.726943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-scripts" (OuterVolumeSpecName: "scripts") pod "4976b217-8ee1-4ef9-9ee8-93101252adcb" (UID: "4976b217-8ee1-4ef9-9ee8-93101252adcb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.745953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4976b217-8ee1-4ef9-9ee8-93101252adcb" (UID: "4976b217-8ee1-4ef9-9ee8-93101252adcb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.779039 4707 scope.go:117] "RemoveContainer" containerID="61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c" Feb 18 06:07:21 crc kubenswrapper[4707]: E0218 06:07:21.780400 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c\": container with ID starting with 61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c not found: ID does not exist" containerID="61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.780427 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c"} err="failed to get container status \"61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c\": rpc error: code = NotFound desc = could not find container \"61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c\": container with ID starting with 61ed248ba849c9978b421d5a8034881d0d153dab7e7e14c94b8defee878fe81c not found: ID does not exist" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.780448 4707 scope.go:117] "RemoveContainer" containerID="3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79" Feb 18 06:07:21 crc kubenswrapper[4707]: E0218 06:07:21.781358 4707 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79\": container with ID starting with 3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79 not found: ID does not exist" containerID="3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.781380 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79"} err="failed to get container status \"3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79\": rpc error: code = NotFound desc = could not find container \"3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79\": container with ID starting with 3a46b02214285cc6b3a93b320751aff47a1b7c11e45ae624b32e56f95b71be79 not found: ID does not exist" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.808448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4976b217-8ee1-4ef9-9ee8-93101252adcb" (UID: "4976b217-8ee1-4ef9-9ee8-93101252adcb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.816569 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data\") pod \"872558df-6201-4e66-9c41-d05a120eec8d\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.817115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmp8w\" (UniqueName: \"kubernetes.io/projected/872558df-6201-4e66-9c41-d05a120eec8d-kube-api-access-hmp8w\") pod \"872558df-6201-4e66-9c41-d05a120eec8d\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.817235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data-custom\") pod \"872558df-6201-4e66-9c41-d05a120eec8d\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.817438 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872558df-6201-4e66-9c41-d05a120eec8d-logs\") pod \"872558df-6201-4e66-9c41-d05a120eec8d\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.817580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-combined-ca-bundle\") pod \"872558df-6201-4e66-9c41-d05a120eec8d\" (UID: \"872558df-6201-4e66-9c41-d05a120eec8d\") " Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.818617 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.818714 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4976b217-8ee1-4ef9-9ee8-93101252adcb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.818779 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.818896 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pxvl\" (UniqueName: \"kubernetes.io/projected/4976b217-8ee1-4ef9-9ee8-93101252adcb-kube-api-access-9pxvl\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.818959 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.827536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872558df-6201-4e66-9c41-d05a120eec8d-logs" (OuterVolumeSpecName: "logs") pod "872558df-6201-4e66-9c41-d05a120eec8d" (UID: "872558df-6201-4e66-9c41-d05a120eec8d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.847172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/872558df-6201-4e66-9c41-d05a120eec8d-kube-api-access-hmp8w" (OuterVolumeSpecName: "kube-api-access-hmp8w") pod "872558df-6201-4e66-9c41-d05a120eec8d" (UID: "872558df-6201-4e66-9c41-d05a120eec8d"). 
InnerVolumeSpecName "kube-api-access-hmp8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.860281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "872558df-6201-4e66-9c41-d05a120eec8d" (UID: "872558df-6201-4e66-9c41-d05a120eec8d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.922277 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmp8w\" (UniqueName: \"kubernetes.io/projected/872558df-6201-4e66-9c41-d05a120eec8d-kube-api-access-hmp8w\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.922322 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.922332 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872558df-6201-4e66-9c41-d05a120eec8d-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.948073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data" (OuterVolumeSpecName: "config-data") pod "872558df-6201-4e66-9c41-d05a120eec8d" (UID: "872558df-6201-4e66-9c41-d05a120eec8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.976116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "872558df-6201-4e66-9c41-d05a120eec8d" (UID: "872558df-6201-4e66-9c41-d05a120eec8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:21 crc kubenswrapper[4707]: I0218 06:07:21.987146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data" (OuterVolumeSpecName: "config-data") pod "4976b217-8ee1-4ef9-9ee8-93101252adcb" (UID: "4976b217-8ee1-4ef9-9ee8-93101252adcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.028256 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.028288 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4976b217-8ee1-4ef9-9ee8-93101252adcb-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.028297 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872558df-6201-4e66-9c41-d05a120eec8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.036346 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.175627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-scripts\") pod \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.175683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqbg6\" (UniqueName: \"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-kube-api-access-tqbg6\") pod \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.175841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-public-tls-certs\") pod \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.175878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-ceph\") pod \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.175925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-httpd-run\") pod \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.175954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-config-data\") pod \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.175979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-combined-ca-bundle\") pod \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.176038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.176132 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-logs\") pod \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\" (UID: \"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.177539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-logs" (OuterVolumeSpecName: "logs") pod "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" (UID: "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.178594 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.178950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" (UID: "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.187100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-kube-api-access-tqbg6" (OuterVolumeSpecName: "kube-api-access-tqbg6") pod "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" (UID: "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2"). InnerVolumeSpecName "kube-api-access-tqbg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.238059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-ceph" (OuterVolumeSpecName: "ceph") pod "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" (UID: "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.246162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-scripts" (OuterVolumeSpecName: "scripts") pod "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" (UID: "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.277963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" (UID: "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.354243 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.354317 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.354345 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqbg6\" (UniqueName: \"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-kube-api-access-tqbg6\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.354376 4707 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-ceph\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.354403 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.354463 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node 
\"crc\" " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.386568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" (UID: "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.400227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4976b217-8ee1-4ef9-9ee8-93101252adcb","Type":"ContainerDied","Data":"4faa6cdec630374a56c6eb65602b120242f97212a5b7f7fa4589aab918d1fafa"} Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.400385 4707 scope.go:117] "RemoveContainer" containerID="655965d1a64e084eae3f847542294bf2a7d235639aa4c827610cd6f459b714d6" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.400608 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.422298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"73dac699-5199-47dc-b173-8df7813c1ad4","Type":"ContainerStarted","Data":"9ad7614b71347632e4855d932ae967bcdef13571e45b18f11d613fd1c2ca0366"} Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.444113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2","Type":"ContainerDied","Data":"3f5b48772cc4ee6ef3792d3b3e3bd66ce34dfa027fc252a44738ef41b381da55"} Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.444366 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.455712 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.035366956 podStartE2EDuration="19.455690269s" podCreationTimestamp="2026-02-18 06:07:03 +0000 UTC" firstStartedPulling="2026-02-18 06:07:04.882451361 +0000 UTC m=+1161.530410495" lastFinishedPulling="2026-02-18 06:07:21.302774674 +0000 UTC m=+1177.950733808" observedRunningTime="2026-02-18 06:07:22.45463837 +0000 UTC m=+1179.102597504" watchObservedRunningTime="2026-02-18 06:07:22.455690269 +0000 UTC m=+1179.103649403" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.468585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-log-httpd\") pod \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.468701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vjxl\" (UniqueName: \"kubernetes.io/projected/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-kube-api-access-6vjxl\") pod \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.468773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-config-data\") pod \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.469068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-run-httpd\") pod 
\"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.469105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-combined-ca-bundle\") pod \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.469148 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-sg-core-conf-yaml\") pod \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.469253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-scripts\") pod \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\" (UID: \"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.470537 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.492631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" (UID: "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.494585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-scripts" (OuterVolumeSpecName: "scripts") pod "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" (UID: "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.497532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-kube-api-access-6vjxl" (OuterVolumeSpecName: "kube-api-access-6vjxl") pod "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" (UID: "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66"). InnerVolumeSpecName "kube-api-access-6vjxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.499693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" (UID: "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.504458 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-config-data" (OuterVolumeSpecName: "config-data") pod "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" (UID: "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.513212 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6998659dbc-vmh65" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.525588 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" (UID: "6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.528921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66","Type":"ContainerDied","Data":"763716bedb4610f516d7e19c0b7335252280a19ef31c66a97ab6569d7334a819"} Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.530962 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.559552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-654b57f754-5bkfb" event={"ID":"872558df-6201-4e66-9c41-d05a120eec8d","Type":"ContainerDied","Data":"21e96fb18f5bd6ec3e23be5651c54d38dd60d745973387647b48b71c83054e5b"} Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.559724 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-654b57f754-5bkfb" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.590462 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.591433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" (UID: "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.596648 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-59d57d6969-cr5bz"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.596914 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-59d57d6969-cr5bz" podUID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerName="neutron-api" containerID="cri-o://51a6b225dae50b85cc67ae1341f9aa5bed51407265f983bd797477bf072586d4" gracePeriod=30 Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.597054 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-59d57d6969-cr5bz" podUID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerName="neutron-httpd" containerID="cri-o://dd39673696c1c71348820e2f471078d3a0a78f235d0d501e6988b7a84c1af716" gracePeriod=30 Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.598127 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.598144 4707 reconciler_common.go:293] "Volume detached 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.598155 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.598165 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.598175 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vjxl\" (UniqueName: \"kubernetes.io/projected/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-kube-api-access-6vjxl\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.598184 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.598192 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.598201 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.675656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" (UID: "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.684441 4707 scope.go:117] "RemoveContainer" containerID="ca3a49015971daede027e9d4cd15fc49ae714693732dfae9fc58591cbe490a75" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.686761 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.701636 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.705413 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.746330 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.776307 4707 scope.go:117] "RemoveContainer" containerID="77e60f34e134a5e24802740f91dd468fd680904bcd33b2525e04274c39738250" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.781735 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782210 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerName="probe" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782227 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerName="probe" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782239 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api-log" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782246 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api-log" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782257 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782268 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782279 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="ceilometer-central-agent" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782287 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="ceilometer-central-agent" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782300 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="sg-core" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782307 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="sg-core" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782320 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerName="glance-log" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782326 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerName="glance-log" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782339 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" containerName="glance-log" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782344 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" containerName="glance-log" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782358 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="ceilometer-notification-agent" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782364 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="ceilometer-notification-agent" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782373 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerName="glance-httpd" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782378 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerName="glance-httpd" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782391 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerName="manila-scheduler" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782398 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerName="manila-scheduler" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782420 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" containerName="glance-httpd" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782426 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" containerName="glance-httpd" Feb 18 06:07:22 crc kubenswrapper[4707]: E0218 06:07:22.782435 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="proxy-httpd" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782441 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="proxy-httpd" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782644 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="ceilometer-notification-agent" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782674 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerName="glance-log" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782686 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782696 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" containerName="glance-httpd" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782706 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="proxy-httpd" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782720 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" containerName="glance-log" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782731 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9656e2a3-692a-44cb-9260-2b3ae4227e82" containerName="glance-httpd" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782740 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="ceilometer-central-agent" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782747 4707 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" containerName="sg-core" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782754 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerName="probe" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782761 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4976b217-8ee1-4ef9-9ee8-93101252adcb" containerName="manila-scheduler" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.782770 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="872558df-6201-4e66-9c41-d05a120eec8d" containerName="barbican-api-log" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.785565 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.790110 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.801534 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.802852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-logs\") pod \"9656e2a3-692a-44cb-9260-2b3ae4227e82\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.803046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgkd7\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-kube-api-access-pgkd7\") pod \"9656e2a3-692a-44cb-9260-2b3ae4227e82\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.803140 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-ceph\") pod \"9656e2a3-692a-44cb-9260-2b3ae4227e82\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.803168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"9656e2a3-692a-44cb-9260-2b3ae4227e82\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.803218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-httpd-run\") pod \"9656e2a3-692a-44cb-9260-2b3ae4227e82\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.803242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-scripts\") pod \"9656e2a3-692a-44cb-9260-2b3ae4227e82\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.803267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-combined-ca-bundle\") pod \"9656e2a3-692a-44cb-9260-2b3ae4227e82\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.803285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-config-data\") pod \"9656e2a3-692a-44cb-9260-2b3ae4227e82\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 
06:07:22.803306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-internal-tls-certs\") pod \"9656e2a3-692a-44cb-9260-2b3ae4227e82\" (UID: \"9656e2a3-692a-44cb-9260-2b3ae4227e82\") " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.804983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-logs" (OuterVolumeSpecName: "logs") pod "9656e2a3-692a-44cb-9260-2b3ae4227e82" (UID: "9656e2a3-692a-44cb-9260-2b3ae4227e82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.805256 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9656e2a3-692a-44cb-9260-2b3ae4227e82" (UID: "9656e2a3-692a-44cb-9260-2b3ae4227e82"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.824007 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-654b57f754-5bkfb"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.843127 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-654b57f754-5bkfb"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.844212 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "9656e2a3-692a-44cb-9260-2b3ae4227e82" (UID: "9656e2a3-692a-44cb-9260-2b3ae4227e82"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.868211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-ceph" (OuterVolumeSpecName: "ceph") pod "9656e2a3-692a-44cb-9260-2b3ae4227e82" (UID: "9656e2a3-692a-44cb-9260-2b3ae4227e82"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.868556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-kube-api-access-pgkd7" (OuterVolumeSpecName: "kube-api-access-pgkd7") pod "9656e2a3-692a-44cb-9260-2b3ae4227e82" (UID: "9656e2a3-692a-44cb-9260-2b3ae4227e82"). InnerVolumeSpecName "kube-api-access-pgkd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.870904 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.871990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-scripts" (OuterVolumeSpecName: "scripts") pod "9656e2a3-692a-44cb-9260-2b3ae4227e82" (UID: "9656e2a3-692a-44cb-9260-2b3ae4227e82"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.883220 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.906341 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.906788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmdq\" (UniqueName: \"kubernetes.io/projected/04bdc22d-7e6e-428b-849a-45c041654404-kube-api-access-snmdq\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.906931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-scripts\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.906969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04bdc22d-7e6e-428b-849a-45c041654404-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.907003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.907028 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.907058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-config-data\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.907129 4707 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-ceph\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.907162 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.907176 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.907187 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.907195 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9656e2a3-692a-44cb-9260-2b3ae4227e82-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc 
kubenswrapper[4707]: I0218 06:07:22.907204 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgkd7\" (UniqueName: \"kubernetes.io/projected/9656e2a3-692a-44cb-9260-2b3ae4227e82-kube-api-access-pgkd7\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.908117 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.910431 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.910834 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.918653 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.963416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-config-data" (OuterVolumeSpecName: "config-data") pod "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" (UID: "0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.965582 4707 scope.go:117] "RemoveContainer" containerID="774ed4ddbb9a9aa8cb8b1341283c9576f6ec331e3eba7306eea0e6e212f6aa06" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.965957 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.972213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9656e2a3-692a-44cb-9260-2b3ae4227e82" (UID: "9656e2a3-692a-44cb-9260-2b3ae4227e82"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.978914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9656e2a3-692a-44cb-9260-2b3ae4227e82" (UID: "9656e2a3-692a-44cb-9260-2b3ae4227e82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:22 crc kubenswrapper[4707]: I0218 06:07:22.990905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-config-data" (OuterVolumeSpecName: "config-data") pod "9656e2a3-692a-44cb-9260-2b3ae4227e82" (UID: "9656e2a3-692a-44cb-9260-2b3ae4227e82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.003548 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ts677"] Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.006333 4707 scope.go:117] "RemoveContainer" containerID="5d656020fd25af66ea614a6b33160e616890c9b3d5f87c69e8c6df3649ff2d5d" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.008646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-config-data\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.008688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.008909 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-ceph\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 
18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-scripts\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-scripts\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04bdc22d-7e6e-428b-849a-45c041654404-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-config-data\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-logs\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqcqk\" (UniqueName: \"kubernetes.io/projected/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-kube-api-access-tqcqk\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-snmdq\" (UniqueName: \"kubernetes.io/projected/04bdc22d-7e6e-428b-849a-45c041654404-kube-api-access-snmdq\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009389 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009402 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009410 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009418 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9656e2a3-692a-44cb-9260-2b3ae4227e82-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009429 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.009751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04bdc22d-7e6e-428b-849a-45c041654404-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.018279 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.021424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-config-data\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.022061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.024321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bdc22d-7e6e-428b-849a-45c041654404-scripts\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.031367 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmdq\" (UniqueName: \"kubernetes.io/projected/04bdc22d-7e6e-428b-849a-45c041654404-kube-api-access-snmdq\") pod \"manila-scheduler-0\" (UID: \"04bdc22d-7e6e-428b-849a-45c041654404\") " pod="openstack/manila-scheduler-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.069691 4707 scope.go:117] "RemoveContainer" containerID="97cc33b24b8809dc241cb83bfc5c96fc0540e9a53361fce3b3d210d015b462ce" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.073322 4707 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8582-account-create-update-d45wr"] Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.111552 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.112609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-logs\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.112763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.112899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqcqk\" (UniqueName: \"kubernetes.io/projected/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-kube-api-access-tqcqk\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0" Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.113169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " 
pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.113279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-config-data\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.113419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-ceph\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.113526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.115100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-scripts\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.116724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-logs\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.114561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.119354 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.122266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.127243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.135485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-scripts\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.136565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-ceph\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.144347 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.151408 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqcqk\" (UniqueName: \"kubernetes.io/projected/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-kube-api-access-tqcqk\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.151522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b-config-data\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.173665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b\") " pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.177435 4707 scope.go:117] "RemoveContainer" containerID="5e7bfb230e7c8682920b6771fed5d956fda830cca53a74632e3f724b8c71cc00"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.194539 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f78d-account-create-update-tw92l"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.239822 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f5brp"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.265182 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-r5xft"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.273645 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.327120 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.331466 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.381631 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f40b-account-create-update-55rmw"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.420282 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.461302 4707 scope.go:117] "RemoveContainer" containerID="1459c6da7524ff6db4d16ac1bdd2cc5e23efac69add9ab6646657dc14ba952d3"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.461726 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.468109 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.473868 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.474102 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.481830 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.492386 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.531071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-config-data\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.531117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-log-httpd\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.531172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-run-httpd\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.531200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.531253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx5mx\" (UniqueName: \"kubernetes.io/projected/273cacd3-33db-4d08-ba70-69a5da859506-kube-api-access-jx5mx\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.531330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.531369 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-scripts\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.541814 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.615718 4707 generic.go:334] "Generic (PLEG): container finished" podID="fe56b299-664f-478b-9f30-8e2a4c457676" containerID="29a0f7ee99a161c9430446c37f0a44e132007f8c079b432ace3e5f0964a9f2cc" exitCode=137
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.615895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-584c97fdd8-f4pbz" event={"ID":"fe56b299-664f-478b-9f30-8e2a4c457676","Type":"ContainerDied","Data":"29a0f7ee99a161c9430446c37f0a44e132007f8c079b432ace3e5f0964a9f2cc"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.632906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-run-httpd\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.632943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.632996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx5mx\" (UniqueName: \"kubernetes.io/projected/273cacd3-33db-4d08-ba70-69a5da859506-kube-api-access-jx5mx\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.633065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.633098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-scripts\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.633139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-config-data\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.633177 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-log-httpd\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.634957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-run-httpd\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.636916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-log-httpd\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.638410 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.641815 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.642367 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-scripts\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.643259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-config-data\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.644516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f5brp" event={"ID":"dd09444c-e4f0-4a3b-afeb-6841e197b017","Type":"ContainerStarted","Data":"4baadb0ba1981b41d1cf36a4e521a898fdad1aa423d7388bbbe56b69b356b634"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.653902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a75b087d-214f-4fed-a30c-d0d4f5607a08","Type":"ContainerStarted","Data":"005d0ebecab8f9c5f33dc47ba7705ee32583e84ea3d0a08f28103e153b4d3d89"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.656973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx5mx\" (UniqueName: \"kubernetes.io/projected/273cacd3-33db-4d08-ba70-69a5da859506-kube-api-access-jx5mx\") pod \"ceilometer-0\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " pod="openstack/ceilometer-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.669920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8582-account-create-update-d45wr" event={"ID":"868cb81a-07d6-4515-8129-32c1e5d06ca7","Type":"ContainerStarted","Data":"f4603b6e6da8c818bfd2cfee912b69c6d0ff02b25f2c53c0d08338801a6462f2"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.707436 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.707778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9656e2a3-692a-44cb-9260-2b3ae4227e82","Type":"ContainerDied","Data":"0cf9b2c7f9e2b80088d9959c94b1db4dcd925375e000ae341e55d92acdbb6468"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.724128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r5xft" event={"ID":"57e387c5-5644-44e5-9479-901efc3f88e8","Type":"ContainerStarted","Data":"4b9b03a2f4fe28648b74618138f0a3c164f926abdd025722db780e04cce621bf"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.724226 4707 scope.go:117] "RemoveContainer" containerID="457c33b156bd0dec7a5bfd7628485b6c0048c0424b4df582de25aa30aee61fe7"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.753086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2e24e699-659c-4701-9459-133197b510d7","Type":"ContainerStarted","Data":"2af9aeda4d15f76c9ef4d2c5fb365cdbb4df26ebf9d2a0eb6c664abfb4e8c27c"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.774003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f40b-account-create-update-55rmw" event={"ID":"3c750576-ce13-4f43-a430-80a8d84b7829","Type":"ContainerStarted","Data":"4fc9567e911aa0ce5bf62a4ab132721c02a6427a57cc087217d9fe59b319e973"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.801432 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.814963 4707 generic.go:334] "Generic (PLEG): container finished" podID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerID="dd39673696c1c71348820e2f471078d3a0a78f235d0d501e6988b7a84c1af716" exitCode=0
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.815076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d57d6969-cr5bz" event={"ID":"eada398c-c6c5-4cbe-b221-1c3461fa8cd8","Type":"ContainerDied","Data":"dd39673696c1c71348820e2f471078d3a0a78f235d0d501e6988b7a84c1af716"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.832035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f78d-account-create-update-tw92l" event={"ID":"8b9ed69c-f65c-4a44-bf84-f9f911905bce","Type":"ContainerStarted","Data":"5c7cc078e55a98950cc9b468df14a38ddd60c9f906bba1c0545a819b20b69432"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.841395 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.849504 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.851471 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.855378 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.855660 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.860439 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.862564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ts677" event={"ID":"b279b0e7-ca70-4cfd-92a9-c90f10658f69","Type":"ContainerStarted","Data":"68bb1a9e25934853def1e7ee5313d626d0fb2cd4e622755f4b5b44ce30d4b6b6"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.885039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"934db458-63c7-4b69-b4bf-70532ff71312","Type":"ContainerStarted","Data":"d16f09469b1584709ceedf9b9b8c956a317bde44a8320eac2be2750d1b527147"}
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.915551 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-ts677" podStartSLOduration=5.915527819 podStartE2EDuration="5.915527819s" podCreationTimestamp="2026-02-18 06:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:23.901360031 +0000 UTC m=+1180.549319165" watchObservedRunningTime="2026-02-18 06:07:23.915527819 +0000 UTC m=+1180.563486953"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.940448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.940521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8582\" (UniqueName: \"kubernetes.io/projected/36b54f53-a798-4b8c-99ab-773ba732530b-kube-api-access-s8582\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.940566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36b54f53-a798-4b8c-99ab-773ba732530b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.940604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.940683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36b54f53-a798-4b8c-99ab-773ba732530b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.940738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.940777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.940922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.940953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36b54f53-a798-4b8c-99ab-773ba732530b-logs\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:23 crc kubenswrapper[4707]: I0218 06:07:23.942761 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.008697 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-878756b99-xx5vn"]
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.043726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.043821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.043919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.043950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36b54f53-a798-4b8c-99ab-773ba732530b-logs\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.044097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.044154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8582\" (UniqueName: \"kubernetes.io/projected/36b54f53-a798-4b8c-99ab-773ba732530b-kube-api-access-s8582\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.044203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36b54f53-a798-4b8c-99ab-773ba732530b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.044250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.044274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36b54f53-a798-4b8c-99ab-773ba732530b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.044471 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.056916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36b54f53-a798-4b8c-99ab-773ba732530b-logs\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.058188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36b54f53-a798-4b8c-99ab-773ba732530b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.058739 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.065731 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36b54f53-a798-4b8c-99ab-773ba732530b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.067173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.067470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.075835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b54f53-a798-4b8c-99ab-773ba732530b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.083300 4707 scope.go:117] "RemoveContainer" containerID="dcb0a29044cf4492d5e71084b21ad2afdd6d4ea21ff51a564c3b93947cae38ab"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.096036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8582\" (UniqueName: \"kubernetes.io/projected/36b54f53-a798-4b8c-99ab-773ba732530b-kube-api-access-s8582\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.110928 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66" path="/var/lib/kubelet/pods/0a24a9ae-2d01-4cd6-94ed-ac92a17a4a66/volumes"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.112091 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4976b217-8ee1-4ef9-9ee8-93101252adcb" path="/var/lib/kubelet/pods/4976b217-8ee1-4ef9-9ee8-93101252adcb/volumes"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.120957 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2" path="/var/lib/kubelet/pods/6bdd71a6-ccc7-4f6a-86bb-5c11eb6f78d2/volumes"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.138098 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872558df-6201-4e66-9c41-d05a120eec8d" path="/var/lib/kubelet/pods/872558df-6201-4e66-9c41-d05a120eec8d/volumes"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.149520 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9656e2a3-692a-44cb-9260-2b3ae4227e82" path="/var/lib/kubelet/pods/9656e2a3-692a-44cb-9260-2b3ae4227e82/volumes"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.150866 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.372949 4707 scope.go:117] "RemoveContainer" containerID="8ba5d40251905a9d76e41bc1b6a66cd86b068bdd7465c2eec88fc3f7a3f5597b"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.376462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"36b54f53-a798-4b8c-99ab-773ba732530b\") " pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.444182 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.601980 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.620296 4707 scope.go:117] "RemoveContainer" containerID="fd117981ca664788b78f8d24fb317f8453f05eab09bfa2489d6ea42e977fa597"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.705486 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-584c97fdd8-f4pbz"
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.800130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-combined-ca-bundle\") pod \"fe56b299-664f-478b-9f30-8e2a4c457676\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") "
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.800943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-config-data\") pod \"fe56b299-664f-478b-9f30-8e2a4c457676\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") "
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.801129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-secret-key\") pod \"fe56b299-664f-478b-9f30-8e2a4c457676\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") "
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.801344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-scripts\") pod \"fe56b299-664f-478b-9f30-8e2a4c457676\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") "
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.801479 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-tls-certs\") pod \"fe56b299-664f-478b-9f30-8e2a4c457676\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") "
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.801633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q8ft\" (UniqueName: \"kubernetes.io/projected/fe56b299-664f-478b-9f30-8e2a4c457676-kube-api-access-9q8ft\") pod \"fe56b299-664f-478b-9f30-8e2a4c457676\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") "
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.801774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe56b299-664f-478b-9f30-8e2a4c457676-logs\") pod \"fe56b299-664f-478b-9f30-8e2a4c457676\" (UID: \"fe56b299-664f-478b-9f30-8e2a4c457676\") "
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.803199 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe56b299-664f-478b-9f30-8e2a4c457676-logs" (OuterVolumeSpecName: "logs") pod "fe56b299-664f-478b-9f30-8e2a4c457676" (UID: "fe56b299-664f-478b-9f30-8e2a4c457676"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.877385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fe56b299-664f-478b-9f30-8e2a4c457676" (UID: "fe56b299-664f-478b-9f30-8e2a4c457676"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.883709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe56b299-664f-478b-9f30-8e2a4c457676-kube-api-access-9q8ft" (OuterVolumeSpecName: "kube-api-access-9q8ft") pod "fe56b299-664f-478b-9f30-8e2a4c457676" (UID: "fe56b299-664f-478b-9f30-8e2a4c457676"). InnerVolumeSpecName "kube-api-access-9q8ft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.904176 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q8ft\" (UniqueName: \"kubernetes.io/projected/fe56b299-664f-478b-9f30-8e2a4c457676-kube-api-access-9q8ft\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.904201 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe56b299-664f-478b-9f30-8e2a4c457676-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.904213 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.923145 4707 generic.go:334] "Generic (PLEG): container finished" podID="dd09444c-e4f0-4a3b-afeb-6841e197b017" containerID="3d4269a3dd801c28cf9dceeee0566fb80237c282e501ec50aa6498dd38d90b4b" exitCode=0
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.923358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f5brp" event={"ID":"dd09444c-e4f0-4a3b-afeb-6841e197b017","Type":"ContainerDied","Data":"3d4269a3dd801c28cf9dceeee0566fb80237c282e501ec50aa6498dd38d90b4b"}
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.930928 4707 generic.go:334] "Generic (PLEG): container finished" podID="868cb81a-07d6-4515-8129-32c1e5d06ca7" containerID="add97642b7ba0a4cf22007e71c4dc74213e686f33d7b6829ec86917b16faab4a" exitCode=0
Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.931023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8582-account-create-update-d45wr"
event={"ID":"868cb81a-07d6-4515-8129-32c1e5d06ca7","Type":"ContainerDied","Data":"add97642b7ba0a4cf22007e71c4dc74213e686f33d7b6829ec86917b16faab4a"} Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.952077 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"934db458-63c7-4b69-b4bf-70532ff71312","Type":"ContainerStarted","Data":"c4c3512023062e1056a44c13ba76099910ee8d3d50aeb7bb1b5dbeccb1c0066a"} Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.982612 4707 generic.go:334] "Generic (PLEG): container finished" podID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerID="51a6b225dae50b85cc67ae1341f9aa5bed51407265f983bd797477bf072586d4" exitCode=0 Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.982971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d57d6969-cr5bz" event={"ID":"eada398c-c6c5-4cbe-b221-1c3461fa8cd8","Type":"ContainerDied","Data":"51a6b225dae50b85cc67ae1341f9aa5bed51407265f983bd797477bf072586d4"} Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.983002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d57d6969-cr5bz" event={"ID":"eada398c-c6c5-4cbe-b221-1c3461fa8cd8","Type":"ContainerDied","Data":"46b18c9a9a20ed28c742f389d1f7de58b7a33d5ca5ec209c8e73d3c052cd0a55"} Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.983013 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b18c9a9a20ed28c742f389d1f7de58b7a33d5ca5ec209c8e73d3c052cd0a55" Feb 18 06:07:24 crc kubenswrapper[4707]: I0218 06:07:24.986962 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.021344 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=7.792342104 podStartE2EDuration="37.021317845s" podCreationTimestamp="2026-02-18 06:06:48 +0000 UTC" firstStartedPulling="2026-02-18 06:06:52.036136927 +0000 UTC m=+1148.684096061" lastFinishedPulling="2026-02-18 06:07:21.265112668 +0000 UTC m=+1177.913071802" observedRunningTime="2026-02-18 06:07:24.980192658 +0000 UTC m=+1181.628151812" watchObservedRunningTime="2026-02-18 06:07:25.021317845 +0000 UTC m=+1181.669276979" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.063734 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.084107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"04bdc22d-7e6e-428b-849a-45c041654404","Type":"ContainerStarted","Data":"da5ba5370268c1b9505118bbd0900d8b8f8b3fa24e3189e743c533cf20835d7a"} Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.090233 4707 generic.go:334] "Generic (PLEG): container finished" podID="b279b0e7-ca70-4cfd-92a9-c90f10658f69" containerID="6a4e495c468efb5e133c7e8317d4ae4053e8d18a7c9e204f0a081d8d8e68e244" exitCode=0 Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.091386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ts677" event={"ID":"b279b0e7-ca70-4cfd-92a9-c90f10658f69","Type":"ContainerDied","Data":"6a4e495c468efb5e133c7e8317d4ae4053e8d18a7c9e204f0a081d8d8e68e244"} Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.105959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b","Type":"ContainerStarted","Data":"aa0945dba65596fd24d538cdf2af731a65253f6e6bda01213ef93875fbac41f9"} Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.115016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-httpd-config\") pod \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.115085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjqrw\" (UniqueName: \"kubernetes.io/projected/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-kube-api-access-qjqrw\") pod \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.115106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-ovndb-tls-certs\") pod \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.115130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-config\") pod \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.115600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-combined-ca-bundle\") pod \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.115659 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-internal-tls-certs\") pod \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.115702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-public-tls-certs\") pod \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\" (UID: \"eada398c-c6c5-4cbe-b221-1c3461fa8cd8\") " Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.117902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-878756b99-xx5vn" event={"ID":"a6b4c749-b753-42b9-8bc7-fb25121f0ea8","Type":"ContainerStarted","Data":"32e5c81d018fee4891a5f5d9f63c41b49ee8716bcd6ae6f80875cdd77509e22f"} Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.123766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-584c97fdd8-f4pbz" event={"ID":"fe56b299-664f-478b-9f30-8e2a4c457676","Type":"ContainerDied","Data":"328aa4e2b64435ab36f682f1e872ec7c28fcc7d0b47c80ed7774a9407077eba1"} Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.123842 4707 scope.go:117] "RemoveContainer" containerID="22c76c3f5497afbf94a962ad1ce73ddde6f0928e5b98bac214feb82c9ced4db8" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.123879 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-584c97fdd8-f4pbz" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.149452 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-kube-api-access-qjqrw" (OuterVolumeSpecName: "kube-api-access-qjqrw") pod "eada398c-c6c5-4cbe-b221-1c3461fa8cd8" (UID: "eada398c-c6c5-4cbe-b221-1c3461fa8cd8"). 
InnerVolumeSpecName "kube-api-access-qjqrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.163012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "eada398c-c6c5-4cbe-b221-1c3461fa8cd8" (UID: "eada398c-c6c5-4cbe-b221-1c3461fa8cd8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.164215 4707 generic.go:334] "Generic (PLEG): container finished" podID="8b9ed69c-f65c-4a44-bf84-f9f911905bce" containerID="2100f4326fde2edef8d66ac7a2809ddcb820e8c0e95af02e9cdae51b0602aa65" exitCode=0 Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.164338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f78d-account-create-update-tw92l" event={"ID":"8b9ed69c-f65c-4a44-bf84-f9f911905bce","Type":"ContainerDied","Data":"2100f4326fde2edef8d66ac7a2809ddcb820e8c0e95af02e9cdae51b0602aa65"} Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.219165 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.219676 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjqrw\" (UniqueName: \"kubernetes.io/projected/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-kube-api-access-qjqrw\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.400054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-scripts" (OuterVolumeSpecName: "scripts") pod "fe56b299-664f-478b-9f30-8e2a4c457676" (UID: "fe56b299-664f-478b-9f30-8e2a4c457676"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.423399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-config-data" (OuterVolumeSpecName: "config-data") pod "fe56b299-664f-478b-9f30-8e2a4c457676" (UID: "fe56b299-664f-478b-9f30-8e2a4c457676"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.426762 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.426788 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe56b299-664f-478b-9f30-8e2a4c457676-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.441642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe56b299-664f-478b-9f30-8e2a4c457676" (UID: "fe56b299-664f-478b-9f30-8e2a4c457676"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.469883 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eada398c-c6c5-4cbe-b221-1c3461fa8cd8" (UID: "eada398c-c6c5-4cbe-b221-1c3461fa8cd8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.501301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "eada398c-c6c5-4cbe-b221-1c3461fa8cd8" (UID: "eada398c-c6c5-4cbe-b221-1c3461fa8cd8"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.522175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "fe56b299-664f-478b-9f30-8e2a4c457676" (UID: "fe56b299-664f-478b-9f30-8e2a4c457676"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.525168 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.554393 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.554573 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.554637 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.554710 4707 
reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe56b299-664f-478b-9f30-8e2a4c457676-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.569867 4707 scope.go:117] "RemoveContainer" containerID="29a0f7ee99a161c9430446c37f0a44e132007f8c079b432ace3e5f0964a9f2cc" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.598045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eada398c-c6c5-4cbe-b221-1c3461fa8cd8" (UID: "eada398c-c6c5-4cbe-b221-1c3461fa8cd8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.658965 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.669094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eada398c-c6c5-4cbe-b221-1c3461fa8cd8" (UID: "eada398c-c6c5-4cbe-b221-1c3461fa8cd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.671825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-config" (OuterVolumeSpecName: "config") pod "eada398c-c6c5-4cbe-b221-1c3461fa8cd8" (UID: "eada398c-c6c5-4cbe-b221-1c3461fa8cd8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.760549 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.760578 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eada398c-c6c5-4cbe-b221-1c3461fa8cd8-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.936955 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-584c97fdd8-f4pbz"] Feb 18 06:07:25 crc kubenswrapper[4707]: I0218 06:07:25.956446 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-584c97fdd8-f4pbz"] Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.071118 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" path="/var/lib/kubelet/pods/fe56b299-664f-478b-9f30-8e2a4c457676/volumes" Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.213704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-878756b99-xx5vn" event={"ID":"a6b4c749-b753-42b9-8bc7-fb25121f0ea8","Type":"ContainerStarted","Data":"f08fa3210db72a2fb9cad4160c6b2b8fda1f2c87bdaed741f90dce52b5da9030"} Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.214233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-878756b99-xx5vn" event={"ID":"a6b4c749-b753-42b9-8bc7-fb25121f0ea8","Type":"ContainerStarted","Data":"cb6f15b1775441ebcd94d87d666b3c913026e05d050377563764124c80303230"} Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.215555 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 
06:07:26.215572 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.220123 4707 generic.go:334] "Generic (PLEG): container finished" podID="57e387c5-5644-44e5-9479-901efc3f88e8" containerID="683c1f6417d107a510ab78429d2e1fb2aade6e86dc4a7ef8a7efc21f5a865e0c" exitCode=0 Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.220234 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r5xft" event={"ID":"57e387c5-5644-44e5-9479-901efc3f88e8","Type":"ContainerDied","Data":"683c1f6417d107a510ab78429d2e1fb2aade6e86dc4a7ef8a7efc21f5a865e0c"} Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.225685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerStarted","Data":"24a42d126172ae127f2d7d26c0f68bf90dd7972ea3a8b2ed4644bc54f5e2b8ed"} Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.227939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"36b54f53-a798-4b8c-99ab-773ba732530b","Type":"ContainerStarted","Data":"11e65aa68f01ecbb776a479ec280c49021769376ec1e5ee8a9b6775974e233b9"} Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.231935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b","Type":"ContainerStarted","Data":"215a21ef79732ce441c079935e2eadba5f276965f6c43ee1a1465215891d2f3b"} Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.249399 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2e24e699-659c-4701-9459-133197b510d7","Type":"ContainerStarted","Data":"b9aeb637f156503ed3bd1593d86dad524a5b35314b84b3449b52af704879a29a"} Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.253594 4707 
generic.go:334] "Generic (PLEG): container finished" podID="3c750576-ce13-4f43-a430-80a8d84b7829" containerID="b58071643fc0e14e0134e48024690ff550ef5996252cb9bc0688b929107e83ea" exitCode=0 Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.253693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f40b-account-create-update-55rmw" event={"ID":"3c750576-ce13-4f43-a430-80a8d84b7829","Type":"ContainerDied","Data":"b58071643fc0e14e0134e48024690ff550ef5996252cb9bc0688b929107e83ea"} Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.256697 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-878756b99-xx5vn" podStartSLOduration=14.256676763 podStartE2EDuration="14.256676763s" podCreationTimestamp="2026-02-18 06:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:26.237982443 +0000 UTC m=+1182.885941577" watchObservedRunningTime="2026-02-18 06:07:26.256676763 +0000 UTC m=+1182.904635897" Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.297262 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a75b087d-214f-4fed-a30c-d0d4f5607a08","Type":"ContainerStarted","Data":"43129cdc88cd67538582f437388ffcf948a021376dbdfe629ccc4b847b75fbce"} Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.300219 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-59d57d6969-cr5bz" Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.526994 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-59d57d6969-cr5bz"] Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.547680 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-59d57d6969-cr5bz"] Feb 18 06:07:26 crc kubenswrapper[4707]: I0218 06:07:26.866160 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.002314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vc4l\" (UniqueName: \"kubernetes.io/projected/868cb81a-07d6-4515-8129-32c1e5d06ca7-kube-api-access-7vc4l\") pod \"868cb81a-07d6-4515-8129-32c1e5d06ca7\" (UID: \"868cb81a-07d6-4515-8129-32c1e5d06ca7\") " Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.003254 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/868cb81a-07d6-4515-8129-32c1e5d06ca7-operator-scripts\") pod \"868cb81a-07d6-4515-8129-32c1e5d06ca7\" (UID: \"868cb81a-07d6-4515-8129-32c1e5d06ca7\") " Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.004352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/868cb81a-07d6-4515-8129-32c1e5d06ca7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "868cb81a-07d6-4515-8129-32c1e5d06ca7" (UID: "868cb81a-07d6-4515-8129-32c1e5d06ca7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.009976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868cb81a-07d6-4515-8129-32c1e5d06ca7-kube-api-access-7vc4l" (OuterVolumeSpecName: "kube-api-access-7vc4l") pod "868cb81a-07d6-4515-8129-32c1e5d06ca7" (UID: "868cb81a-07d6-4515-8129-32c1e5d06ca7"). InnerVolumeSpecName "kube-api-access-7vc4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.106594 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/868cb81a-07d6-4515-8129-32c1e5d06ca7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.106638 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vc4l\" (UniqueName: \"kubernetes.io/projected/868cb81a-07d6-4515-8129-32c1e5d06ca7-kube-api-access-7vc4l\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.245992 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.250536 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.312945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pftgx\" (UniqueName: \"kubernetes.io/projected/8b9ed69c-f65c-4a44-bf84-f9f911905bce-kube-api-access-pftgx\") pod \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\" (UID: \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\") " Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.313137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9ed69c-f65c-4a44-bf84-f9f911905bce-operator-scripts\") pod \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\" (UID: \"8b9ed69c-f65c-4a44-bf84-f9f911905bce\") " Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.313249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r96gs\" (UniqueName: \"kubernetes.io/projected/b279b0e7-ca70-4cfd-92a9-c90f10658f69-kube-api-access-r96gs\") pod \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\" (UID: \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\") " Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.313283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279b0e7-ca70-4cfd-92a9-c90f10658f69-operator-scripts\") pod \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\" (UID: \"b279b0e7-ca70-4cfd-92a9-c90f10658f69\") " Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.314367 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9ed69c-f65c-4a44-bf84-f9f911905bce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b9ed69c-f65c-4a44-bf84-f9f911905bce" (UID: "8b9ed69c-f65c-4a44-bf84-f9f911905bce"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.316758 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9ed69c-f65c-4a44-bf84-f9f911905bce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.320104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b279b0e7-ca70-4cfd-92a9-c90f10658f69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b279b0e7-ca70-4cfd-92a9-c90f10658f69" (UID: "b279b0e7-ca70-4cfd-92a9-c90f10658f69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.329072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9ed69c-f65c-4a44-bf84-f9f911905bce-kube-api-access-pftgx" (OuterVolumeSpecName: "kube-api-access-pftgx") pod "8b9ed69c-f65c-4a44-bf84-f9f911905bce" (UID: "8b9ed69c-f65c-4a44-bf84-f9f911905bce"). InnerVolumeSpecName "kube-api-access-pftgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.342727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"36b54f53-a798-4b8c-99ab-773ba732530b","Type":"ContainerStarted","Data":"02f60a28ce1fbb2d8f8d026c42aed60afd8a55d374e8bf9ac2b3ff9b13d696e1"} Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.348183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a75b087d-214f-4fed-a30c-d0d4f5607a08","Type":"ContainerStarted","Data":"e15162c9515fe470cc15f0a39c9ed51d6fca5d6bcb1b86e2097e5db304ef5670"} Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.374334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8582-account-create-update-d45wr" event={"ID":"868cb81a-07d6-4515-8129-32c1e5d06ca7","Type":"ContainerDied","Data":"f4603b6e6da8c818bfd2cfee912b69c6d0ff02b25f2c53c0d08338801a6462f2"} Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.374393 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4603b6e6da8c818bfd2cfee912b69c6d0ff02b25f2c53c0d08338801a6462f2" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.374481 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8582-account-create-update-d45wr" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.394999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b279b0e7-ca70-4cfd-92a9-c90f10658f69-kube-api-access-r96gs" (OuterVolumeSpecName: "kube-api-access-r96gs") pod "b279b0e7-ca70-4cfd-92a9-c90f10658f69" (UID: "b279b0e7-ca70-4cfd-92a9-c90f10658f69"). InnerVolumeSpecName "kube-api-access-r96gs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.403018 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ts677" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.403257 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=20.403227648 podStartE2EDuration="20.403227648s" podCreationTimestamp="2026-02-18 06:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:27.402372875 +0000 UTC m=+1184.050332009" watchObservedRunningTime="2026-02-18 06:07:27.403227648 +0000 UTC m=+1184.051186782" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.405755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ts677" event={"ID":"b279b0e7-ca70-4cfd-92a9-c90f10658f69","Type":"ContainerDied","Data":"68bb1a9e25934853def1e7ee5313d626d0fb2cd4e622755f4b5b44ce30d4b6b6"} Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.409824 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68bb1a9e25934853def1e7ee5313d626d0fb2cd4e622755f4b5b44ce30d4b6b6" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.419470 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pftgx\" (UniqueName: \"kubernetes.io/projected/8b9ed69c-f65c-4a44-bf84-f9f911905bce-kube-api-access-pftgx\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.419501 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r96gs\" (UniqueName: \"kubernetes.io/projected/b279b0e7-ca70-4cfd-92a9-c90f10658f69-kube-api-access-r96gs\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.419510 4707 reconciler_common.go:293] "Volume 
detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279b0e7-ca70-4cfd-92a9-c90f10658f69-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.424854 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f78d-account-create-update-tw92l" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.427896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f78d-account-create-update-tw92l" event={"ID":"8b9ed69c-f65c-4a44-bf84-f9f911905bce","Type":"ContainerDied","Data":"5c7cc078e55a98950cc9b468df14a38ddd60c9f906bba1c0545a819b20b69432"} Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.427945 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c7cc078e55a98950cc9b468df14a38ddd60c9f906bba1c0545a819b20b69432" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.447645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerStarted","Data":"f3aea98f9ebc71787df90190ee2c342416a766ffeca76171d3eeeae82321a0f3"} Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.447834 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.459859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f5brp" event={"ID":"dd09444c-e4f0-4a3b-afeb-6841e197b017","Type":"ContainerDied","Data":"4baadb0ba1981b41d1cf36a4e521a898fdad1aa423d7388bbbe56b69b356b634"} Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.460035 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4baadb0ba1981b41d1cf36a4e521a898fdad1aa423d7388bbbe56b69b356b634" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.468809 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"04bdc22d-7e6e-428b-849a-45c041654404","Type":"ContainerStarted","Data":"01b203e3e1a4044af2239c110d7061065004ed40bc840a31af152119259df6b1"} Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.527720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m7sl\" (UniqueName: \"kubernetes.io/projected/dd09444c-e4f0-4a3b-afeb-6841e197b017-kube-api-access-2m7sl\") pod \"dd09444c-e4f0-4a3b-afeb-6841e197b017\" (UID: \"dd09444c-e4f0-4a3b-afeb-6841e197b017\") " Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.527773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd09444c-e4f0-4a3b-afeb-6841e197b017-operator-scripts\") pod \"dd09444c-e4f0-4a3b-afeb-6841e197b017\" (UID: \"dd09444c-e4f0-4a3b-afeb-6841e197b017\") " Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.528647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd09444c-e4f0-4a3b-afeb-6841e197b017-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd09444c-e4f0-4a3b-afeb-6841e197b017" (UID: "dd09444c-e4f0-4a3b-afeb-6841e197b017"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.534547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd09444c-e4f0-4a3b-afeb-6841e197b017-kube-api-access-2m7sl" (OuterVolumeSpecName: "kube-api-access-2m7sl") pod "dd09444c-e4f0-4a3b-afeb-6841e197b017" (UID: "dd09444c-e4f0-4a3b-afeb-6841e197b017"). InnerVolumeSpecName "kube-api-access-2m7sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.632326 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m7sl\" (UniqueName: \"kubernetes.io/projected/dd09444c-e4f0-4a3b-afeb-6841e197b017-kube-api-access-2m7sl\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:27 crc kubenswrapper[4707]: I0218 06:07:27.632354 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd09444c-e4f0-4a3b-afeb-6841e197b017-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.167420 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.172696 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" path="/var/lib/kubelet/pods/eada398c-c6c5-4cbe-b221-1c3461fa8cd8/volumes" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.287045 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.374443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c750576-ce13-4f43-a430-80a8d84b7829-operator-scripts\") pod \"3c750576-ce13-4f43-a430-80a8d84b7829\" (UID: \"3c750576-ce13-4f43-a430-80a8d84b7829\") " Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.374564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdbj8\" (UniqueName: \"kubernetes.io/projected/3c750576-ce13-4f43-a430-80a8d84b7829-kube-api-access-sdbj8\") pod \"3c750576-ce13-4f43-a430-80a8d84b7829\" (UID: \"3c750576-ce13-4f43-a430-80a8d84b7829\") " Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.383645 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c750576-ce13-4f43-a430-80a8d84b7829-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c750576-ce13-4f43-a430-80a8d84b7829" (UID: "3c750576-ce13-4f43-a430-80a8d84b7829"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.407608 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c750576-ce13-4f43-a430-80a8d84b7829-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.434057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c750576-ce13-4f43-a430-80a8d84b7829-kube-api-access-sdbj8" (OuterVolumeSpecName: "kube-api-access-sdbj8") pod "3c750576-ce13-4f43-a430-80a8d84b7829" (UID: "3c750576-ce13-4f43-a430-80a8d84b7829"). InnerVolumeSpecName "kube-api-access-sdbj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.448916 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.498198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f40b-account-create-update-55rmw" event={"ID":"3c750576-ce13-4f43-a430-80a8d84b7829","Type":"ContainerDied","Data":"4fc9567e911aa0ce5bf62a4ab132721c02a6427a57cc087217d9fe59b319e973"} Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.498254 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fc9567e911aa0ce5bf62a4ab132721c02a6427a57cc087217d9fe59b319e973" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.498332 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f40b-account-create-update-55rmw" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.511130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjh27\" (UniqueName: \"kubernetes.io/projected/57e387c5-5644-44e5-9479-901efc3f88e8-kube-api-access-hjh27\") pod \"57e387c5-5644-44e5-9479-901efc3f88e8\" (UID: \"57e387c5-5644-44e5-9479-901efc3f88e8\") " Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.511222 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57e387c5-5644-44e5-9479-901efc3f88e8-operator-scripts\") pod \"57e387c5-5644-44e5-9479-901efc3f88e8\" (UID: \"57e387c5-5644-44e5-9479-901efc3f88e8\") " Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.512020 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdbj8\" (UniqueName: \"kubernetes.io/projected/3c750576-ce13-4f43-a430-80a8d84b7829-kube-api-access-sdbj8\") on node \"crc\" DevicePath \"\"" Feb 18 
06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.512485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57e387c5-5644-44e5-9479-901efc3f88e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57e387c5-5644-44e5-9479-901efc3f88e8" (UID: "57e387c5-5644-44e5-9479-901efc3f88e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.518338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e387c5-5644-44e5-9479-901efc3f88e8-kube-api-access-hjh27" (OuterVolumeSpecName: "kube-api-access-hjh27") pod "57e387c5-5644-44e5-9479-901efc3f88e8" (UID: "57e387c5-5644-44e5-9479-901efc3f88e8"). InnerVolumeSpecName "kube-api-access-hjh27". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.525419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r5xft" event={"ID":"57e387c5-5644-44e5-9479-901efc3f88e8","Type":"ContainerDied","Data":"4b9b03a2f4fe28648b74618138f0a3c164f926abdd025722db780e04cce621bf"} Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.525477 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b9b03a2f4fe28648b74618138f0a3c164f926abdd025722db780e04cce621bf" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.525563 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-r5xft" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.530569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerStarted","Data":"75cbeadc822c587c10b7114ef7e9160737a56b41d46d5337460edab962627659"} Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.532591 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"04bdc22d-7e6e-428b-849a-45c041654404","Type":"ContainerStarted","Data":"c5f43821b90095f601af84c9487f9737e526f0490484ded0cbc4def790c20c85"} Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.567213 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"36b54f53-a798-4b8c-99ab-773ba732530b","Type":"ContainerStarted","Data":"ba544f1947c6ed26510d8c5988439e35670d4d667d28d6a8117adb9193343dc3"} Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.589875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b","Type":"ContainerStarted","Data":"879e12133c45cd76eb35f5a2310f3932c90b7cf132b0c8b09c7060c3eb86ace7"} Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.605554 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=6.605520671 podStartE2EDuration="6.605520671s" podCreationTimestamp="2026-02-18 06:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:28.558868965 +0000 UTC m=+1185.206828099" watchObservedRunningTime="2026-02-18 06:07:28.605520671 +0000 UTC m=+1185.253479795" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.613616 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.613599967 podStartE2EDuration="5.613599967s" podCreationTimestamp="2026-02-18 06:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:28.597647451 +0000 UTC m=+1185.245606585" watchObservedRunningTime="2026-02-18 06:07:28.613599967 +0000 UTC m=+1185.261559101" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.617407 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjh27\" (UniqueName: \"kubernetes.io/projected/57e387c5-5644-44e5-9479-901efc3f88e8-kube-api-access-hjh27\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.617453 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57e387c5-5644-44e5-9479-901efc3f88e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.618010 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f5brp" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.618139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2e24e699-659c-4701-9459-133197b510d7","Type":"ContainerStarted","Data":"ee207155cf4a6c7b8aeca232d012e3ae2fe45f0e32171a34a469d4212110ce6e"} Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.684523 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=21.68449643 podStartE2EDuration="21.68449643s" podCreationTimestamp="2026-02-18 06:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:28.669333825 +0000 UTC m=+1185.317292959" watchObservedRunningTime="2026-02-18 06:07:28.68449643 +0000 UTC m=+1185.332455554" Feb 18 06:07:28 crc kubenswrapper[4707]: I0218 06:07:28.688061 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.688049204 podStartE2EDuration="6.688049204s" podCreationTimestamp="2026-02-18 06:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:28.649196738 +0000 UTC m=+1185.297155862" watchObservedRunningTime="2026-02-18 06:07:28.688049204 +0000 UTC m=+1185.336008338" Feb 18 06:07:29 crc kubenswrapper[4707]: I0218 06:07:29.147402 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Feb 18 06:07:29 crc kubenswrapper[4707]: I0218 06:07:29.630251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerStarted","Data":"ce56557931061df9974ad8d7cbc5442a80de320707086db4c2ca240221bcd252"} Feb 18 06:07:31 crc 
kubenswrapper[4707]: I0218 06:07:31.654267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerStarted","Data":"6b3d89bffd4f9dab95525833ec7d4442da82cf6a6f0f66ea648d16dfa5606c3f"} Feb 18 06:07:31 crc kubenswrapper[4707]: I0218 06:07:31.686860 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.23147168 podStartE2EDuration="8.686842438s" podCreationTimestamp="2026-02-18 06:07:23 +0000 UTC" firstStartedPulling="2026-02-18 06:07:25.083405214 +0000 UTC m=+1181.731364348" lastFinishedPulling="2026-02-18 06:07:30.538775972 +0000 UTC m=+1187.186735106" observedRunningTime="2026-02-18 06:07:31.68317822 +0000 UTC m=+1188.331137354" watchObservedRunningTime="2026-02-18 06:07:31.686842438 +0000 UTC m=+1188.334801572" Feb 18 06:07:32 crc kubenswrapper[4707]: I0218 06:07:32.666742 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.145112 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.185319 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.187853 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-878756b99-xx5vn" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.277454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.277501 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 18 06:07:33 crc kubenswrapper[4707]: 
I0218 06:07:33.319915 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.342870 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.457766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.677570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.677637 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.727405 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 18 06:07:33 crc kubenswrapper[4707]: I0218 06:07:33.747952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.016723 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.411386 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77pfl"] Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 06:07:34.412158 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon-log" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.412258 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon-log" Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 06:07:34.412351 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="dd09444c-e4f0-4a3b-afeb-6841e197b017" containerName="mariadb-database-create" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.412456 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd09444c-e4f0-4a3b-afeb-6841e197b017" containerName="mariadb-database-create" Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 06:07:34.412556 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerName="neutron-httpd" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.412637 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerName="neutron-httpd" Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 06:07:34.412721 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868cb81a-07d6-4515-8129-32c1e5d06ca7" containerName="mariadb-account-create-update" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.412819 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="868cb81a-07d6-4515-8129-32c1e5d06ca7" containerName="mariadb-account-create-update" Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 06:07:34.412919 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerName="neutron-api" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.412997 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerName="neutron-api" Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 06:07:34.413075 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e387c5-5644-44e5-9479-901efc3f88e8" containerName="mariadb-database-create" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.413154 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e387c5-5644-44e5-9479-901efc3f88e8" containerName="mariadb-database-create" Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 
06:07:34.413243 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.413327 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon" Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 06:07:34.413411 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b279b0e7-ca70-4cfd-92a9-c90f10658f69" containerName="mariadb-database-create" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.413482 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b279b0e7-ca70-4cfd-92a9-c90f10658f69" containerName="mariadb-database-create" Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 06:07:34.413576 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9ed69c-f65c-4a44-bf84-f9f911905bce" containerName="mariadb-account-create-update" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.413665 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9ed69c-f65c-4a44-bf84-f9f911905bce" containerName="mariadb-account-create-update" Feb 18 06:07:34 crc kubenswrapper[4707]: E0218 06:07:34.413741 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c750576-ce13-4f43-a430-80a8d84b7829" containerName="mariadb-account-create-update" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.413846 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c750576-ce13-4f43-a430-80a8d84b7829" containerName="mariadb-account-create-update" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.414175 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd09444c-e4f0-4a3b-afeb-6841e197b017" containerName="mariadb-database-create" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.414280 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c750576-ce13-4f43-a430-80a8d84b7829" 
containerName="mariadb-account-create-update" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.414372 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon-log" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.414460 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b279b0e7-ca70-4cfd-92a9-c90f10658f69" containerName="mariadb-database-create" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.414547 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="868cb81a-07d6-4515-8129-32c1e5d06ca7" containerName="mariadb-account-create-update" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.414638 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerName="neutron-httpd" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.414730 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eada398c-c6c5-4cbe-b221-1c3461fa8cd8" containerName="neutron-api" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.414842 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e387c5-5644-44e5-9479-901efc3f88e8" containerName="mariadb-database-create" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.414938 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9ed69c-f65c-4a44-bf84-f9f911905bce" containerName="mariadb-account-create-update" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.415017 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe56b299-664f-478b-9f30-8e2a4c457676" containerName="horizon" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.415964 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.418234 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.418243 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-l9hgj" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.425107 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.434599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77pfl"] Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.574592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk87k\" (UniqueName: \"kubernetes.io/projected/ef28baea-267b-4872-bc1d-036169e0a9f2-kube-api-access-zk87k\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.575121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.575151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-config-data\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " 
pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.575178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-scripts\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.603631 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.603980 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.652577 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.653307 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.677204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk87k\" (UniqueName: \"kubernetes.io/projected/ef28baea-267b-4872-bc1d-036169e0a9f2-kube-api-access-zk87k\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.677253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " 
pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.677288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-config-data\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.677319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-scripts\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.690193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.701409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk87k\" (UniqueName: \"kubernetes.io/projected/ef28baea-267b-4872-bc1d-036169e0a9f2-kube-api-access-zk87k\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.714488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-scripts\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " 
pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.715487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-config-data\") pod \"nova-cell0-conductor-db-sync-77pfl\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") " pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.734682 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.734715 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.735329 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="ceilometer-central-agent" containerID="cri-o://f3aea98f9ebc71787df90190ee2c342416a766ffeca76171d3eeeae82321a0f3" gracePeriod=30 Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.736051 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="proxy-httpd" containerID="cri-o://6b3d89bffd4f9dab95525833ec7d4442da82cf6a6f0f66ea648d16dfa5606c3f" gracePeriod=30 Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.736125 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="sg-core" containerID="cri-o://ce56557931061df9974ad8d7cbc5442a80de320707086db4c2ca240221bcd252" gracePeriod=30 Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.736179 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="ceilometer-notification-agent" containerID="cri-o://75cbeadc822c587c10b7114ef7e9160737a56b41d46d5337460edab962627659" gracePeriod=30 Feb 18 06:07:34 crc kubenswrapper[4707]: I0218 06:07:34.742692 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77pfl" Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.348681 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77pfl"] Feb 18 06:07:35 crc kubenswrapper[4707]: W0218 06:07:35.360168 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef28baea_267b_4872_bc1d_036169e0a9f2.slice/crio-1e8cb28269bd11b2209a7e81e6ac78014e26df11334a7253f6a2072f0f09b0df WatchSource:0}: Error finding container 1e8cb28269bd11b2209a7e81e6ac78014e26df11334a7253f6a2072f0f09b0df: Status 404 returned error can't find the container with id 1e8cb28269bd11b2209a7e81e6ac78014e26df11334a7253f6a2072f0f09b0df Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.742014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77pfl" event={"ID":"ef28baea-267b-4872-bc1d-036169e0a9f2","Type":"ContainerStarted","Data":"1e8cb28269bd11b2209a7e81e6ac78014e26df11334a7253f6a2072f0f09b0df"} Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.744998 4707 generic.go:334] "Generic (PLEG): container finished" podID="273cacd3-33db-4d08-ba70-69a5da859506" containerID="6b3d89bffd4f9dab95525833ec7d4442da82cf6a6f0f66ea648d16dfa5606c3f" exitCode=0 Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.745051 4707 generic.go:334] "Generic (PLEG): container finished" podID="273cacd3-33db-4d08-ba70-69a5da859506" containerID="ce56557931061df9974ad8d7cbc5442a80de320707086db4c2ca240221bcd252" exitCode=2 Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.745059 4707 
generic.go:334] "Generic (PLEG): container finished" podID="273cacd3-33db-4d08-ba70-69a5da859506" containerID="75cbeadc822c587c10b7114ef7e9160737a56b41d46d5337460edab962627659" exitCode=0 Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.745067 4707 generic.go:334] "Generic (PLEG): container finished" podID="273cacd3-33db-4d08-ba70-69a5da859506" containerID="f3aea98f9ebc71787df90190ee2c342416a766ffeca76171d3eeeae82321a0f3" exitCode=0 Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.745103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerDied","Data":"6b3d89bffd4f9dab95525833ec7d4442da82cf6a6f0f66ea648d16dfa5606c3f"} Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.745200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerDied","Data":"ce56557931061df9974ad8d7cbc5442a80de320707086db4c2ca240221bcd252"} Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.745216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerDied","Data":"75cbeadc822c587c10b7114ef7e9160737a56b41d46d5337460edab962627659"} Feb 18 06:07:35 crc kubenswrapper[4707]: I0218 06:07:35.745230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerDied","Data":"f3aea98f9ebc71787df90190ee2c342416a766ffeca76171d3eeeae82321a0f3"} Feb 18 06:07:36 crc kubenswrapper[4707]: I0218 06:07:36.183387 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 18 06:07:36 crc kubenswrapper[4707]: I0218 06:07:36.183521 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:07:36 crc kubenswrapper[4707]: I0218 
06:07:36.332597 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 18 06:07:36 crc kubenswrapper[4707]: I0218 06:07:36.754432 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:07:36 crc kubenswrapper[4707]: I0218 06:07:36.754740 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.204340 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.336060 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-log-httpd\") pod \"273cacd3-33db-4d08-ba70-69a5da859506\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.336200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx5mx\" (UniqueName: \"kubernetes.io/projected/273cacd3-33db-4d08-ba70-69a5da859506-kube-api-access-jx5mx\") pod \"273cacd3-33db-4d08-ba70-69a5da859506\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.336314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-scripts\") pod \"273cacd3-33db-4d08-ba70-69a5da859506\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.336391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-sg-core-conf-yaml\") pod \"273cacd3-33db-4d08-ba70-69a5da859506\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " 
Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.336413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-config-data\") pod \"273cacd3-33db-4d08-ba70-69a5da859506\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.336460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-run-httpd\") pod \"273cacd3-33db-4d08-ba70-69a5da859506\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.336487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-combined-ca-bundle\") pod \"273cacd3-33db-4d08-ba70-69a5da859506\" (UID: \"273cacd3-33db-4d08-ba70-69a5da859506\") " Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.336701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "273cacd3-33db-4d08-ba70-69a5da859506" (UID: "273cacd3-33db-4d08-ba70-69a5da859506"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.337282 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.337845 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "273cacd3-33db-4d08-ba70-69a5da859506" (UID: "273cacd3-33db-4d08-ba70-69a5da859506"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.358242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273cacd3-33db-4d08-ba70-69a5da859506-kube-api-access-jx5mx" (OuterVolumeSpecName: "kube-api-access-jx5mx") pod "273cacd3-33db-4d08-ba70-69a5da859506" (UID: "273cacd3-33db-4d08-ba70-69a5da859506"). InnerVolumeSpecName "kube-api-access-jx5mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.369647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-scripts" (OuterVolumeSpecName: "scripts") pod "273cacd3-33db-4d08-ba70-69a5da859506" (UID: "273cacd3-33db-4d08-ba70-69a5da859506"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.413921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "273cacd3-33db-4d08-ba70-69a5da859506" (UID: "273cacd3-33db-4d08-ba70-69a5da859506"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.415052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "273cacd3-33db-4d08-ba70-69a5da859506" (UID: "273cacd3-33db-4d08-ba70-69a5da859506"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.438767 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.438813 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.438823 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/273cacd3-33db-4d08-ba70-69a5da859506-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.438832 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.438841 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx5mx\" (UniqueName: \"kubernetes.io/projected/273cacd3-33db-4d08-ba70-69a5da859506-kube-api-access-jx5mx\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.443905 4707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-config-data" (OuterVolumeSpecName: "config-data") pod "273cacd3-33db-4d08-ba70-69a5da859506" (UID: "273cacd3-33db-4d08-ba70-69a5da859506"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.542373 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273cacd3-33db-4d08-ba70-69a5da859506-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.764510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"273cacd3-33db-4d08-ba70-69a5da859506","Type":"ContainerDied","Data":"24a42d126172ae127f2d7d26c0f68bf90dd7972ea3a8b2ed4644bc54f5e2b8ed"} Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.764566 4707 scope.go:117] "RemoveContainer" containerID="6b3d89bffd4f9dab95525833ec7d4442da82cf6a6f0f66ea648d16dfa5606c3f" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.764704 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.796430 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.800264 4707 scope.go:117] "RemoveContainer" containerID="ce56557931061df9974ad8d7cbc5442a80de320707086db4c2ca240221bcd252" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.806828 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.842951 4707 scope.go:117] "RemoveContainer" containerID="75cbeadc822c587c10b7114ef7e9160737a56b41d46d5337460edab962627659" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.853104 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:37 crc kubenswrapper[4707]: E0218 06:07:37.853953 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="sg-core" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.853985 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="sg-core" Feb 18 06:07:37 crc kubenswrapper[4707]: E0218 06:07:37.854011 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="proxy-httpd" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.854024 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="proxy-httpd" Feb 18 06:07:37 crc kubenswrapper[4707]: E0218 06:07:37.854061 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="ceilometer-central-agent" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.854075 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="273cacd3-33db-4d08-ba70-69a5da859506" 
containerName="ceilometer-central-agent" Feb 18 06:07:37 crc kubenswrapper[4707]: E0218 06:07:37.854112 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="ceilometer-notification-agent" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.854124 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="ceilometer-notification-agent" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.854528 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="ceilometer-central-agent" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.854561 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="sg-core" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.854593 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="ceilometer-notification-agent" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.854619 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="273cacd3-33db-4d08-ba70-69a5da859506" containerName="proxy-httpd" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.863427 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.863844 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.865988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.867584 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.878718 4707 scope.go:117] "RemoveContainer" containerID="f3aea98f9ebc71787df90190ee2c342416a766ffeca76171d3eeeae82321a0f3" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.952089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-scripts\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.952147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xrp\" (UniqueName: \"kubernetes.io/projected/70e45364-65e7-4b3a-b847-66b140af070c-kube-api-access-s8xrp\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.952230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-log-httpd\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.952284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.952317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-config-data\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.952366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:37 crc kubenswrapper[4707]: I0218 06:07:37.952475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-run-httpd\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.054919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.054974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-config-data\") pod \"ceilometer-0\" (UID: 
\"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.055023 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.055042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-run-httpd\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.055091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-scripts\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.055115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xrp\" (UniqueName: \"kubernetes.io/projected/70e45364-65e7-4b3a-b847-66b140af070c-kube-api-access-s8xrp\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.055163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-log-httpd\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.055834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-log-httpd\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.056008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-run-httpd\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.061195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.064069 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.064628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-scripts\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.066750 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273cacd3-33db-4d08-ba70-69a5da859506" path="/var/lib/kubelet/pods/273cacd3-33db-4d08-ba70-69a5da859506/volumes" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.068180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" 
Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.068267 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.069938 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.072499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-config-data\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.073577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xrp\" (UniqueName: \"kubernetes.io/projected/70e45364-65e7-4b3a-b847-66b140af070c-kube-api-access-s8xrp\") pod \"ceilometer-0\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.199873 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:38 crc kubenswrapper[4707]: I0218 06:07:38.818708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:39 crc kubenswrapper[4707]: I0218 06:07:39.800669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerStarted","Data":"3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a"} Feb 18 06:07:39 crc kubenswrapper[4707]: I0218 06:07:39.801589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerStarted","Data":"c3b0385e2345bdbb5ce82c41631e503a5fad7ee85b98136b165e708f4047d3b7"} Feb 18 06:07:40 crc kubenswrapper[4707]: I0218 06:07:40.817423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerStarted","Data":"a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7"} Feb 18 06:07:41 crc kubenswrapper[4707]: I0218 06:07:41.071195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Feb 18 06:07:41 crc kubenswrapper[4707]: I0218 06:07:41.115590 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Feb 18 06:07:41 crc kubenswrapper[4707]: I0218 06:07:41.851244 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="934db458-63c7-4b69-b4bf-70532ff71312" containerName="manila-share" containerID="cri-o://d16f09469b1584709ceedf9b9b8c956a317bde44a8320eac2be2750d1b527147" gracePeriod=30 Feb 18 06:07:41 crc kubenswrapper[4707]: I0218 06:07:41.851725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerStarted","Data":"c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb"} Feb 18 06:07:41 crc kubenswrapper[4707]: I0218 06:07:41.851818 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="934db458-63c7-4b69-b4bf-70532ff71312" containerName="probe" containerID="cri-o://c4c3512023062e1056a44c13ba76099910ee8d3d50aeb7bb1b5dbeccb1c0066a" gracePeriod=30 Feb 18 06:07:42 crc kubenswrapper[4707]: I0218 06:07:42.866666 4707 generic.go:334] "Generic (PLEG): container finished" podID="934db458-63c7-4b69-b4bf-70532ff71312" containerID="c4c3512023062e1056a44c13ba76099910ee8d3d50aeb7bb1b5dbeccb1c0066a" exitCode=0 Feb 18 06:07:42 crc kubenswrapper[4707]: I0218 06:07:42.867045 4707 generic.go:334] "Generic (PLEG): container finished" podID="934db458-63c7-4b69-b4bf-70532ff71312" containerID="d16f09469b1584709ceedf9b9b8c956a317bde44a8320eac2be2750d1b527147" exitCode=1 Feb 18 06:07:42 crc kubenswrapper[4707]: I0218 06:07:42.866774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"934db458-63c7-4b69-b4bf-70532ff71312","Type":"ContainerDied","Data":"c4c3512023062e1056a44c13ba76099910ee8d3d50aeb7bb1b5dbeccb1c0066a"} Feb 18 06:07:42 crc kubenswrapper[4707]: I0218 06:07:42.867095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"934db458-63c7-4b69-b4bf-70532ff71312","Type":"ContainerDied","Data":"d16f09469b1584709ceedf9b9b8c956a317bde44a8320eac2be2750d1b527147"} Feb 18 06:07:44 crc kubenswrapper[4707]: I0218 06:07:44.669961 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.320660 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.450141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-combined-ca-bundle\") pod \"934db458-63c7-4b69-b4bf-70532ff71312\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.450187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-var-lib-manila\") pod \"934db458-63c7-4b69-b4bf-70532ff71312\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.450219 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcd94\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-kube-api-access-hcd94\") pod \"934db458-63c7-4b69-b4bf-70532ff71312\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.450340 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-etc-machine-id\") pod \"934db458-63c7-4b69-b4bf-70532ff71312\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.450387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data\") pod \"934db458-63c7-4b69-b4bf-70532ff71312\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.450386 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "934db458-63c7-4b69-b4bf-70532ff71312" (UID: "934db458-63c7-4b69-b4bf-70532ff71312"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.450410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-ceph\") pod \"934db458-63c7-4b69-b4bf-70532ff71312\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.450660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data-custom\") pod \"934db458-63c7-4b69-b4bf-70532ff71312\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.450721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-scripts\") pod \"934db458-63c7-4b69-b4bf-70532ff71312\" (UID: \"934db458-63c7-4b69-b4bf-70532ff71312\") " Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.451048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "934db458-63c7-4b69-b4bf-70532ff71312" (UID: "934db458-63c7-4b69-b4bf-70532ff71312"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.451685 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.451709 4707 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/934db458-63c7-4b69-b4bf-70532ff71312-var-lib-manila\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.457031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "934db458-63c7-4b69-b4bf-70532ff71312" (UID: "934db458-63c7-4b69-b4bf-70532ff71312"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.457172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-ceph" (OuterVolumeSpecName: "ceph") pod "934db458-63c7-4b69-b4bf-70532ff71312" (UID: "934db458-63c7-4b69-b4bf-70532ff71312"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.457185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-kube-api-access-hcd94" (OuterVolumeSpecName: "kube-api-access-hcd94") pod "934db458-63c7-4b69-b4bf-70532ff71312" (UID: "934db458-63c7-4b69-b4bf-70532ff71312"). InnerVolumeSpecName "kube-api-access-hcd94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.458222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-scripts" (OuterVolumeSpecName: "scripts") pod "934db458-63c7-4b69-b4bf-70532ff71312" (UID: "934db458-63c7-4b69-b4bf-70532ff71312"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.500613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "934db458-63c7-4b69-b4bf-70532ff71312" (UID: "934db458-63c7-4b69-b4bf-70532ff71312"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.554003 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.554037 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.554046 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.554056 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcd94\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-kube-api-access-hcd94\") on node \"crc\" DevicePath \"\"" 
Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.554069 4707 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/934db458-63c7-4b69-b4bf-70532ff71312-ceph\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.557056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data" (OuterVolumeSpecName: "config-data") pod "934db458-63c7-4b69-b4bf-70532ff71312" (UID: "934db458-63c7-4b69-b4bf-70532ff71312"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.656457 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934db458-63c7-4b69-b4bf-70532ff71312-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.917778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"934db458-63c7-4b69-b4bf-70532ff71312","Type":"ContainerDied","Data":"f1b6cb8c0a170a9490c293843fe57ecaccec4618fa9c160d8b2a1b5767e4a88a"} Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.918159 4707 scope.go:117] "RemoveContainer" containerID="c4c3512023062e1056a44c13ba76099910ee8d3d50aeb7bb1b5dbeccb1c0066a" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.917814 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.945952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerStarted","Data":"3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb"} Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.946109 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.950405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77pfl" event={"ID":"ef28baea-267b-4872-bc1d-036169e0a9f2","Type":"ContainerStarted","Data":"0b4f1333566d26b8362f423ce93b5519a653c4f378e39637bb3fd75d9247ad78"} Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.966873 4707 scope.go:117] "RemoveContainer" containerID="d16f09469b1584709ceedf9b9b8c956a317bde44a8320eac2be2750d1b527147" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.987651 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.717772268 podStartE2EDuration="9.987629828s" podCreationTimestamp="2026-02-18 06:07:37 +0000 UTC" firstStartedPulling="2026-02-18 06:07:38.836352913 +0000 UTC m=+1195.484312047" lastFinishedPulling="2026-02-18 06:07:46.106210473 +0000 UTC m=+1202.754169607" observedRunningTime="2026-02-18 06:07:46.971890998 +0000 UTC m=+1203.619850132" watchObservedRunningTime="2026-02-18 06:07:46.987629828 +0000 UTC m=+1203.635588962" Feb 18 06:07:46 crc kubenswrapper[4707]: I0218 06:07:46.995032 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-77pfl" podStartSLOduration=2.2921314280000002 podStartE2EDuration="12.995011875s" podCreationTimestamp="2026-02-18 06:07:34 +0000 UTC" firstStartedPulling="2026-02-18 06:07:35.362103595 
+0000 UTC m=+1192.010062729" lastFinishedPulling="2026-02-18 06:07:46.064984042 +0000 UTC m=+1202.712943176" observedRunningTime="2026-02-18 06:07:46.989783636 +0000 UTC m=+1203.637742780" watchObservedRunningTime="2026-02-18 06:07:46.995011875 +0000 UTC m=+1203.642971009" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.013872 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.023789 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.039856 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Feb 18 06:07:47 crc kubenswrapper[4707]: E0218 06:07:47.040310 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934db458-63c7-4b69-b4bf-70532ff71312" containerName="probe" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.040328 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="934db458-63c7-4b69-b4bf-70532ff71312" containerName="probe" Feb 18 06:07:47 crc kubenswrapper[4707]: E0218 06:07:47.040351 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934db458-63c7-4b69-b4bf-70532ff71312" containerName="manila-share" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.040358 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="934db458-63c7-4b69-b4bf-70532ff71312" containerName="manila-share" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.040547 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="934db458-63c7-4b69-b4bf-70532ff71312" containerName="manila-share" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.040573 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="934db458-63c7-4b69-b4bf-70532ff71312" containerName="probe" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.041641 4707 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.044212 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.047913 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.170309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.170388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-scripts\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.170416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-ceph\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.170499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc 
kubenswrapper[4707]: I0218 06:07:47.170543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.170580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.170615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-config-data\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.170640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbrz\" (UniqueName: \"kubernetes.io/projected/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-kube-api-access-gbbrz\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.271817 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 
06:07:47.271866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-scripts\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.271888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-ceph\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.272086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.272107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.272129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.272517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-config-data\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.272522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.272551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbrz\" (UniqueName: \"kubernetes.io/projected/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-kube-api-access-gbbrz\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.272584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.277877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.278263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-config-data\") pod \"manila-share-share1-0\" (UID: 
\"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.279289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.282307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-ceph\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.282309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-scripts\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.294356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbrz\" (UniqueName: \"kubernetes.io/projected/1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b-kube-api-access-gbbrz\") pod \"manila-share-share1-0\" (UID: \"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b\") " pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.379668 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Feb 18 06:07:47 crc kubenswrapper[4707]: I0218 06:07:47.996557 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 18 06:07:48 crc kubenswrapper[4707]: I0218 06:07:48.072982 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934db458-63c7-4b69-b4bf-70532ff71312" path="/var/lib/kubelet/pods/934db458-63c7-4b69-b4bf-70532ff71312/volumes" Feb 18 06:07:48 crc kubenswrapper[4707]: I0218 06:07:48.972871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b","Type":"ContainerStarted","Data":"4b55e6a82be430adccbd0a3670ba5eeeba1ab4734537c5cf70403f5348a6aad5"} Feb 18 06:07:48 crc kubenswrapper[4707]: I0218 06:07:48.973450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b","Type":"ContainerStarted","Data":"ee7c1544f2b793248ac0e122354f310887aa3625170b554e637c4c1a6f08b351"} Feb 18 06:07:48 crc kubenswrapper[4707]: I0218 06:07:48.973473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b","Type":"ContainerStarted","Data":"3c1c7d82417e59eccdb13ce82c4775f82b09dc58c225091a8010e650dc9bad25"} Feb 18 06:07:49 crc kubenswrapper[4707]: I0218 06:07:49.005344 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.005297894 podStartE2EDuration="2.005297894s" podCreationTimestamp="2026-02-18 06:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:07:48.99203693 +0000 UTC m=+1205.639996074" watchObservedRunningTime="2026-02-18 06:07:49.005297894 +0000 UTC m=+1205.653257068" Feb 18 06:07:50 crc kubenswrapper[4707]: 
I0218 06:07:50.331256 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:07:50 crc kubenswrapper[4707]: I0218 06:07:50.332342 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="ceilometer-central-agent" containerID="cri-o://3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a" gracePeriod=30 Feb 18 06:07:50 crc kubenswrapper[4707]: I0218 06:07:50.332991 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="proxy-httpd" containerID="cri-o://3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb" gracePeriod=30 Feb 18 06:07:50 crc kubenswrapper[4707]: I0218 06:07:50.333039 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="sg-core" containerID="cri-o://c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb" gracePeriod=30 Feb 18 06:07:50 crc kubenswrapper[4707]: I0218 06:07:50.333081 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="ceilometer-notification-agent" containerID="cri-o://a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7" gracePeriod=30 Feb 18 06:07:51 crc kubenswrapper[4707]: I0218 06:07:51.009602 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e45364-65e7-4b3a-b847-66b140af070c" containerID="3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb" exitCode=0 Feb 18 06:07:51 crc kubenswrapper[4707]: I0218 06:07:51.009661 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e45364-65e7-4b3a-b847-66b140af070c" 
containerID="c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb" exitCode=2 Feb 18 06:07:51 crc kubenswrapper[4707]: I0218 06:07:51.009679 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e45364-65e7-4b3a-b847-66b140af070c" containerID="3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a" exitCode=0 Feb 18 06:07:51 crc kubenswrapper[4707]: I0218 06:07:51.009702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerDied","Data":"3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb"} Feb 18 06:07:51 crc kubenswrapper[4707]: I0218 06:07:51.009781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerDied","Data":"c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb"} Feb 18 06:07:51 crc kubenswrapper[4707]: I0218 06:07:51.009830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerDied","Data":"3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a"} Feb 18 06:07:52 crc kubenswrapper[4707]: E0218 06:07:52.280293 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70e45364_65e7_4b3a_b847_66b140af070c.slice/crio-a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7.scope\": RecentStats: unable to find data in memory cache]" Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.552270 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.693124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-scripts\") pod \"70e45364-65e7-4b3a-b847-66b140af070c\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.693247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-log-httpd\") pod \"70e45364-65e7-4b3a-b847-66b140af070c\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.693308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-combined-ca-bundle\") pod \"70e45364-65e7-4b3a-b847-66b140af070c\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.693361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8xrp\" (UniqueName: \"kubernetes.io/projected/70e45364-65e7-4b3a-b847-66b140af070c-kube-api-access-s8xrp\") pod \"70e45364-65e7-4b3a-b847-66b140af070c\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.693386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-sg-core-conf-yaml\") pod \"70e45364-65e7-4b3a-b847-66b140af070c\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.693416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-config-data\") pod \"70e45364-65e7-4b3a-b847-66b140af070c\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.693473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-run-httpd\") pod \"70e45364-65e7-4b3a-b847-66b140af070c\" (UID: \"70e45364-65e7-4b3a-b847-66b140af070c\") " Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.694192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "70e45364-65e7-4b3a-b847-66b140af070c" (UID: "70e45364-65e7-4b3a-b847-66b140af070c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.694653 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "70e45364-65e7-4b3a-b847-66b140af070c" (UID: "70e45364-65e7-4b3a-b847-66b140af070c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.700997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-scripts" (OuterVolumeSpecName: "scripts") pod "70e45364-65e7-4b3a-b847-66b140af070c" (UID: "70e45364-65e7-4b3a-b847-66b140af070c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.704943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e45364-65e7-4b3a-b847-66b140af070c-kube-api-access-s8xrp" (OuterVolumeSpecName: "kube-api-access-s8xrp") pod "70e45364-65e7-4b3a-b847-66b140af070c" (UID: "70e45364-65e7-4b3a-b847-66b140af070c"). InnerVolumeSpecName "kube-api-access-s8xrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.725598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "70e45364-65e7-4b3a-b847-66b140af070c" (UID: "70e45364-65e7-4b3a-b847-66b140af070c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.773393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70e45364-65e7-4b3a-b847-66b140af070c" (UID: "70e45364-65e7-4b3a-b847-66b140af070c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.795623 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8xrp\" (UniqueName: \"kubernetes.io/projected/70e45364-65e7-4b3a-b847-66b140af070c-kube-api-access-s8xrp\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.796164 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.796177 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.796189 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.796202 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70e45364-65e7-4b3a-b847-66b140af070c-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.796214 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.805933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-config-data" (OuterVolumeSpecName: "config-data") pod "70e45364-65e7-4b3a-b847-66b140af070c" (UID: "70e45364-65e7-4b3a-b847-66b140af070c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:07:52 crc kubenswrapper[4707]: I0218 06:07:52.899415 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e45364-65e7-4b3a-b847-66b140af070c-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.060873 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e45364-65e7-4b3a-b847-66b140af070c" containerID="a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7" exitCode=0
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.060956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerDied","Data":"a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7"}
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.061046 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.061088 4707 scope.go:117] "RemoveContainer" containerID="3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.061067 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70e45364-65e7-4b3a-b847-66b140af070c","Type":"ContainerDied","Data":"c3b0385e2345bdbb5ce82c41631e503a5fad7ee85b98136b165e708f4047d3b7"}
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.101262 4707 scope.go:117] "RemoveContainer" containerID="c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.109911 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.120489 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.145124 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:07:53 crc kubenswrapper[4707]: E0218 06:07:53.145808 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="proxy-httpd"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.145826 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="proxy-httpd"
Feb 18 06:07:53 crc kubenswrapper[4707]: E0218 06:07:53.145839 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="ceilometer-notification-agent"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.145846 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="ceilometer-notification-agent"
Feb 18 06:07:53 crc kubenswrapper[4707]: E0218 06:07:53.145873 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="ceilometer-central-agent"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.145880 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="ceilometer-central-agent"
Feb 18 06:07:53 crc kubenswrapper[4707]: E0218 06:07:53.145894 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="sg-core"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.145899 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="sg-core"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.146073 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="ceilometer-central-agent"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.146084 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="sg-core"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.146102 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="proxy-httpd"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.146113 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e45364-65e7-4b3a-b847-66b140af070c" containerName="ceilometer-notification-agent"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.147372 4707 scope.go:117] "RemoveContainer" containerID="a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.148027 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.154649 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.154830 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.165507 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.214391 4707 scope.go:117] "RemoveContainer" containerID="3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.241256 4707 scope.go:117] "RemoveContainer" containerID="3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb"
Feb 18 06:07:53 crc kubenswrapper[4707]: E0218 06:07:53.241840 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb\": container with ID starting with 3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb not found: ID does not exist" containerID="3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.241875 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb"} err="failed to get container status \"3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb\": rpc error: code = NotFound desc = could not find container \"3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb\": container with ID starting with 3111f4447b1dd12cde2f8e77b2c330af2adc2301b00019419cb0ab43a099e5fb not found: ID does not exist"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.241897 4707 scope.go:117] "RemoveContainer" containerID="c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb"
Feb 18 06:07:53 crc kubenswrapper[4707]: E0218 06:07:53.242204 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb\": container with ID starting with c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb not found: ID does not exist" containerID="c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.242236 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb"} err="failed to get container status \"c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb\": rpc error: code = NotFound desc = could not find container \"c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb\": container with ID starting with c4e9d4946090500e604c3492ae0e9cb4b303d3f4620522ae2d32f3eeb81994eb not found: ID does not exist"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.242256 4707 scope.go:117] "RemoveContainer" containerID="a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7"
Feb 18 06:07:53 crc kubenswrapper[4707]: E0218 06:07:53.242460 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7\": container with ID starting with a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7 not found: ID does not exist" containerID="a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.242491 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7"} err="failed to get container status \"a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7\": rpc error: code = NotFound desc = could not find container \"a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7\": container with ID starting with a62cce2d2a0fd2f8b32d61ae59469bb4d57eed5b784c63e5477058062d5218c7 not found: ID does not exist"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.242504 4707 scope.go:117] "RemoveContainer" containerID="3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a"
Feb 18 06:07:53 crc kubenswrapper[4707]: E0218 06:07:53.242953 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a\": container with ID starting with 3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a not found: ID does not exist" containerID="3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.242981 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a"} err="failed to get container status \"3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a\": rpc error: code = NotFound desc = could not find container \"3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a\": container with ID starting with 3a6aa422120e8764b1a4ff43702133ef53afaa5e897701cfefe46b966fbfd39a not found: ID does not exist"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.307137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.307178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.307307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-config-data\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.307360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-scripts\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.307397 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvdzf\" (UniqueName: \"kubernetes.io/projected/be589645-269d-467b-80c1-d975a5c43d1a-kube-api-access-qvdzf\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.307509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-log-httpd\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.307601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-run-httpd\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.410621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-config-data\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.410693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-scripts\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.410730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvdzf\" (UniqueName: \"kubernetes.io/projected/be589645-269d-467b-80c1-d975a5c43d1a-kube-api-access-qvdzf\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.410777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-log-httpd\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.410820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-run-httpd\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.410888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.410905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.411699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-log-httpd\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.411965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-run-httpd\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.416781 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-scripts\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.419211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.423628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.424271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-config-data\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.436890 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvdzf\" (UniqueName: \"kubernetes.io/projected/be589645-269d-467b-80c1-d975a5c43d1a-kube-api-access-qvdzf\") pod \"ceilometer-0\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.512006 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:07:53 crc kubenswrapper[4707]: I0218 06:07:53.971148 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:07:54 crc kubenswrapper[4707]: I0218 06:07:54.063646 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70e45364-65e7-4b3a-b847-66b140af070c" path="/var/lib/kubelet/pods/70e45364-65e7-4b3a-b847-66b140af070c/volumes"
Feb 18 06:07:54 crc kubenswrapper[4707]: I0218 06:07:54.073178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerStarted","Data":"300aad25c21a60329237f6ce0d1928fa4f3bd3438a626e28a1e9bb6053bba7d4"}
Feb 18 06:07:55 crc kubenswrapper[4707]: I0218 06:07:55.083259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerStarted","Data":"ba8a0b50a44cc81c8b8a67d24071d704b522c4f2e5581b579a2afe9cafc249d5"}
Feb 18 06:07:57 crc kubenswrapper[4707]: I0218 06:07:57.110098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerStarted","Data":"31a343bd1e010b8b86bd5da0ee210fcfdad1afb9f5ff4b7c634c3ab1438d9f6b"}
Feb 18 06:07:57 crc kubenswrapper[4707]: I0218 06:07:57.110703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerStarted","Data":"ecc74b794fdfdf47f36e2bc32c7d5f7b0c9a707e231c2dd9f8df52978f6d33d0"}
Feb 18 06:07:57 crc kubenswrapper[4707]: I0218 06:07:57.381070 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Feb 18 06:07:59 crc kubenswrapper[4707]: I0218 06:07:59.129624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerStarted","Data":"d79b99a17ac642f5346d193cfad3848440e6d39f1524c80f3f0eac30c2efdf4c"}
Feb 18 06:07:59 crc kubenswrapper[4707]: I0218 06:07:59.130104 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 06:07:59 crc kubenswrapper[4707]: I0218 06:07:59.132444 4707 generic.go:334] "Generic (PLEG): container finished" podID="ef28baea-267b-4872-bc1d-036169e0a9f2" containerID="0b4f1333566d26b8362f423ce93b5519a653c4f378e39637bb3fd75d9247ad78" exitCode=0
Feb 18 06:07:59 crc kubenswrapper[4707]: I0218 06:07:59.132508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77pfl" event={"ID":"ef28baea-267b-4872-bc1d-036169e0a9f2","Type":"ContainerDied","Data":"0b4f1333566d26b8362f423ce93b5519a653c4f378e39637bb3fd75d9247ad78"}
Feb 18 06:07:59 crc kubenswrapper[4707]: I0218 06:07:59.180256 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.514100469 podStartE2EDuration="6.180230334s" podCreationTimestamp="2026-02-18 06:07:53 +0000 UTC" firstStartedPulling="2026-02-18 06:07:53.972029745 +0000 UTC m=+1210.619988889" lastFinishedPulling="2026-02-18 06:07:58.63815962 +0000 UTC m=+1215.286118754" observedRunningTime="2026-02-18 06:07:59.156211833 +0000 UTC m=+1215.804170967" watchObservedRunningTime="2026-02-18 06:07:59.180230334 +0000 UTC m=+1215.828189508"
Feb 18 06:07:59 crc kubenswrapper[4707]: I0218 06:07:59.818361 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.495595 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77pfl"
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.661475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-scripts\") pod \"ef28baea-267b-4872-bc1d-036169e0a9f2\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") "
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.662276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-combined-ca-bundle\") pod \"ef28baea-267b-4872-bc1d-036169e0a9f2\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") "
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.662354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-config-data\") pod \"ef28baea-267b-4872-bc1d-036169e0a9f2\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") "
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.662469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk87k\" (UniqueName: \"kubernetes.io/projected/ef28baea-267b-4872-bc1d-036169e0a9f2-kube-api-access-zk87k\") pod \"ef28baea-267b-4872-bc1d-036169e0a9f2\" (UID: \"ef28baea-267b-4872-bc1d-036169e0a9f2\") "
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.676972 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-scripts" (OuterVolumeSpecName: "scripts") pod "ef28baea-267b-4872-bc1d-036169e0a9f2" (UID: "ef28baea-267b-4872-bc1d-036169e0a9f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.681848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef28baea-267b-4872-bc1d-036169e0a9f2-kube-api-access-zk87k" (OuterVolumeSpecName: "kube-api-access-zk87k") pod "ef28baea-267b-4872-bc1d-036169e0a9f2" (UID: "ef28baea-267b-4872-bc1d-036169e0a9f2"). InnerVolumeSpecName "kube-api-access-zk87k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.693531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef28baea-267b-4872-bc1d-036169e0a9f2" (UID: "ef28baea-267b-4872-bc1d-036169e0a9f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.706131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-config-data" (OuterVolumeSpecName: "config-data") pod "ef28baea-267b-4872-bc1d-036169e0a9f2" (UID: "ef28baea-267b-4872-bc1d-036169e0a9f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.765997 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.766882 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk87k\" (UniqueName: \"kubernetes.io/projected/ef28baea-267b-4872-bc1d-036169e0a9f2-kube-api-access-zk87k\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.767031 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:00 crc kubenswrapper[4707]: I0218 06:08:00.767111 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef28baea-267b-4872-bc1d-036169e0a9f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.152665 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="ceilometer-central-agent" containerID="cri-o://ba8a0b50a44cc81c8b8a67d24071d704b522c4f2e5581b579a2afe9cafc249d5" gracePeriod=30
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.152760 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77pfl"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.153585 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="sg-core" containerID="cri-o://31a343bd1e010b8b86bd5da0ee210fcfdad1afb9f5ff4b7c634c3ab1438d9f6b" gracePeriod=30
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.153623 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="ceilometer-notification-agent" containerID="cri-o://ecc74b794fdfdf47f36e2bc32c7d5f7b0c9a707e231c2dd9f8df52978f6d33d0" gracePeriod=30
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.153657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77pfl" event={"ID":"ef28baea-267b-4872-bc1d-036169e0a9f2","Type":"ContainerDied","Data":"1e8cb28269bd11b2209a7e81e6ac78014e26df11334a7253f6a2072f0f09b0df"}
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.153681 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8cb28269bd11b2209a7e81e6ac78014e26df11334a7253f6a2072f0f09b0df"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.154023 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="proxy-httpd" containerID="cri-o://d79b99a17ac642f5346d193cfad3848440e6d39f1524c80f3f0eac30c2efdf4c" gracePeriod=30
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.280287 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 06:08:01 crc kubenswrapper[4707]: E0218 06:08:01.280715 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef28baea-267b-4872-bc1d-036169e0a9f2" containerName="nova-cell0-conductor-db-sync"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.280736 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef28baea-267b-4872-bc1d-036169e0a9f2" containerName="nova-cell0-conductor-db-sync"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.280981 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef28baea-267b-4872-bc1d-036169e0a9f2" containerName="nova-cell0-conductor-db-sync"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.281699 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.285846 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.286087 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-l9hgj"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.295213 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.381867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqs7h\" (UniqueName: \"kubernetes.io/projected/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-kube-api-access-tqs7h\") pod \"nova-cell0-conductor-0\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.381924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.381953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.484313 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqs7h\" (UniqueName: \"kubernetes.io/projected/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-kube-api-access-tqs7h\") pod \"nova-cell0-conductor-0\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.484774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.484938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.489400 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.492427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.502504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqs7h\" (UniqueName: \"kubernetes.io/projected/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-kube-api-access-tqs7h\") pod \"nova-cell0-conductor-0\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.664727 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 06:08:01 crc kubenswrapper[4707]: I0218 06:08:01.665563 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.174343 4707 generic.go:334] "Generic (PLEG): container finished" podID="be589645-269d-467b-80c1-d975a5c43d1a" containerID="d79b99a17ac642f5346d193cfad3848440e6d39f1524c80f3f0eac30c2efdf4c" exitCode=0
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.174743 4707 generic.go:334] "Generic (PLEG): container finished" podID="be589645-269d-467b-80c1-d975a5c43d1a" containerID="31a343bd1e010b8b86bd5da0ee210fcfdad1afb9f5ff4b7c634c3ab1438d9f6b" exitCode=2
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.174755 4707 generic.go:334] "Generic (PLEG): container finished" podID="be589645-269d-467b-80c1-d975a5c43d1a" containerID="ecc74b794fdfdf47f36e2bc32c7d5f7b0c9a707e231c2dd9f8df52978f6d33d0" exitCode=0
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.174762 4707 generic.go:334] "Generic (PLEG): container finished" podID="be589645-269d-467b-80c1-d975a5c43d1a" containerID="ba8a0b50a44cc81c8b8a67d24071d704b522c4f2e5581b579a2afe9cafc249d5" exitCode=0
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.174564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerDied","Data":"d79b99a17ac642f5346d193cfad3848440e6d39f1524c80f3f0eac30c2efdf4c"}
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.174877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerDied","Data":"31a343bd1e010b8b86bd5da0ee210fcfdad1afb9f5ff4b7c634c3ab1438d9f6b"}
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.174896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerDied","Data":"ecc74b794fdfdf47f36e2bc32c7d5f7b0c9a707e231c2dd9f8df52978f6d33d0"}
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.174905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerDied","Data":"ba8a0b50a44cc81c8b8a67d24071d704b522c4f2e5581b579a2afe9cafc249d5"}
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.203385 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.308535 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.443365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-sg-core-conf-yaml\") pod \"be589645-269d-467b-80c1-d975a5c43d1a\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.450101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-log-httpd\") pod \"be589645-269d-467b-80c1-d975a5c43d1a\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.450165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-scripts\") pod \"be589645-269d-467b-80c1-d975a5c43d1a\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.450226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-combined-ca-bundle\") pod \"be589645-269d-467b-80c1-d975a5c43d1a\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.450271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-config-data\") pod \"be589645-269d-467b-80c1-d975a5c43d1a\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.450299 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvdzf\" (UniqueName: 
\"kubernetes.io/projected/be589645-269d-467b-80c1-d975a5c43d1a-kube-api-access-qvdzf\") pod \"be589645-269d-467b-80c1-d975a5c43d1a\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.450341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-run-httpd\") pod \"be589645-269d-467b-80c1-d975a5c43d1a\" (UID: \"be589645-269d-467b-80c1-d975a5c43d1a\") " Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.455317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "be589645-269d-467b-80c1-d975a5c43d1a" (UID: "be589645-269d-467b-80c1-d975a5c43d1a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.455720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "be589645-269d-467b-80c1-d975a5c43d1a" (UID: "be589645-269d-467b-80c1-d975a5c43d1a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.458147 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be589645-269d-467b-80c1-d975a5c43d1a-kube-api-access-qvdzf" (OuterVolumeSpecName: "kube-api-access-qvdzf") pod "be589645-269d-467b-80c1-d975a5c43d1a" (UID: "be589645-269d-467b-80c1-d975a5c43d1a"). InnerVolumeSpecName "kube-api-access-qvdzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.460009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-scripts" (OuterVolumeSpecName: "scripts") pod "be589645-269d-467b-80c1-d975a5c43d1a" (UID: "be589645-269d-467b-80c1-d975a5c43d1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.481599 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "be589645-269d-467b-80c1-d975a5c43d1a" (UID: "be589645-269d-467b-80c1-d975a5c43d1a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.555342 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.555380 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.555394 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.555405 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvdzf\" (UniqueName: \"kubernetes.io/projected/be589645-269d-467b-80c1-d975a5c43d1a-kube-api-access-qvdzf\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:02 crc 
kubenswrapper[4707]: I0218 06:08:02.555417 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be589645-269d-467b-80c1-d975a5c43d1a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.572464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be589645-269d-467b-80c1-d975a5c43d1a" (UID: "be589645-269d-467b-80c1-d975a5c43d1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.603842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-config-data" (OuterVolumeSpecName: "config-data") pod "be589645-269d-467b-80c1-d975a5c43d1a" (UID: "be589645-269d-467b-80c1-d975a5c43d1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.656382 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:02 crc kubenswrapper[4707]: I0218 06:08:02.656738 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be589645-269d-467b-80c1-d975a5c43d1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.191505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c3fe6993-5a3f-486e-90e7-1c54a4e846ec","Type":"ContainerStarted","Data":"2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15"} Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.191883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c3fe6993-5a3f-486e-90e7-1c54a4e846ec","Type":"ContainerStarted","Data":"54c626d8becd8e5c2f5edb84d4347ddc85b7f2a544c7b446a53ccbe9f6458b44"} Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.192456 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerName="nova-cell0-conductor-conductor" containerID="cri-o://2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" gracePeriod=30 Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.199215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be589645-269d-467b-80c1-d975a5c43d1a","Type":"ContainerDied","Data":"300aad25c21a60329237f6ce0d1928fa4f3bd3438a626e28a1e9bb6053bba7d4"} Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.199283 4707 scope.go:117] "RemoveContainer" 
containerID="d79b99a17ac642f5346d193cfad3848440e6d39f1524c80f3f0eac30c2efdf4c" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.199508 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.216848 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.216821939 podStartE2EDuration="2.216821939s" podCreationTimestamp="2026-02-18 06:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:03.208692061 +0000 UTC m=+1219.856651235" watchObservedRunningTime="2026-02-18 06:08:03.216821939 +0000 UTC m=+1219.864781073" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.241319 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.248001 4707 scope.go:117] "RemoveContainer" containerID="31a343bd1e010b8b86bd5da0ee210fcfdad1afb9f5ff4b7c634c3ab1438d9f6b" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.256945 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.264834 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:03 crc kubenswrapper[4707]: E0218 06:08:03.265303 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="proxy-httpd" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.265324 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="proxy-httpd" Feb 18 06:08:03 crc kubenswrapper[4707]: E0218 06:08:03.265353 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be589645-269d-467b-80c1-d975a5c43d1a" 
containerName="sg-core" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.265362 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="sg-core" Feb 18 06:08:03 crc kubenswrapper[4707]: E0218 06:08:03.265408 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="ceilometer-central-agent" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.265418 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="ceilometer-central-agent" Feb 18 06:08:03 crc kubenswrapper[4707]: E0218 06:08:03.265440 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="ceilometer-notification-agent" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.265448 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="ceilometer-notification-agent" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.266717 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="ceilometer-central-agent" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.266749 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="sg-core" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.266763 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="proxy-httpd" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.266773 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be589645-269d-467b-80c1-d975a5c43d1a" containerName="ceilometer-notification-agent" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.269214 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.275490 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.280943 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.281110 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.287896 4707 scope.go:117] "RemoveContainer" containerID="ecc74b794fdfdf47f36e2bc32c7d5f7b0c9a707e231c2dd9f8df52978f6d33d0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.314963 4707 scope.go:117] "RemoveContainer" containerID="ba8a0b50a44cc81c8b8a67d24071d704b522c4f2e5581b579a2afe9cafc249d5" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.371018 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8nt\" (UniqueName: \"kubernetes.io/projected/6324588c-728c-4540-ab29-2ca00b24ae39-kube-api-access-xt8nt\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.371092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-config-data\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.371128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " 
pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.371154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-log-httpd\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.371184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.372703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-run-httpd\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.372781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-scripts\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.408215 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:03 crc kubenswrapper[4707]: E0218 06:08:03.409098 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-xt8nt log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" 
podUID="6324588c-728c-4540-ab29-2ca00b24ae39" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.474094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-config-data\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.474144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.474162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-log-httpd\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.474193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.474234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-run-httpd\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.474265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-scripts\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.474354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8nt\" (UniqueName: \"kubernetes.io/projected/6324588c-728c-4540-ab29-2ca00b24ae39-kube-api-access-xt8nt\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.475082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-log-httpd\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.475084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-run-httpd\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.479778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-scripts\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.480760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-config-data\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.481251 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.489573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8nt\" (UniqueName: \"kubernetes.io/projected/6324588c-728c-4540-ab29-2ca00b24ae39-kube-api-access-xt8nt\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:03 crc kubenswrapper[4707]: I0218 06:08:03.491456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " pod="openstack/ceilometer-0" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.082292 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be589645-269d-467b-80c1-d975a5c43d1a" path="/var/lib/kubelet/pods/be589645-269d-467b-80c1-d975a5c43d1a/volumes" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.217581 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.230691 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.391424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-log-httpd\") pod \"6324588c-728c-4540-ab29-2ca00b24ae39\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.391479 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-scripts\") pod \"6324588c-728c-4540-ab29-2ca00b24ae39\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.391515 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-config-data\") pod \"6324588c-728c-4540-ab29-2ca00b24ae39\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.391539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-run-httpd\") pod \"6324588c-728c-4540-ab29-2ca00b24ae39\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.391593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt8nt\" (UniqueName: \"kubernetes.io/projected/6324588c-728c-4540-ab29-2ca00b24ae39-kube-api-access-xt8nt\") pod \"6324588c-728c-4540-ab29-2ca00b24ae39\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.391844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6324588c-728c-4540-ab29-2ca00b24ae39" (UID: "6324588c-728c-4540-ab29-2ca00b24ae39"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.391964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6324588c-728c-4540-ab29-2ca00b24ae39" (UID: "6324588c-728c-4540-ab29-2ca00b24ae39"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.392222 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-sg-core-conf-yaml\") pod \"6324588c-728c-4540-ab29-2ca00b24ae39\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.392262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-combined-ca-bundle\") pod \"6324588c-728c-4540-ab29-2ca00b24ae39\" (UID: \"6324588c-728c-4540-ab29-2ca00b24ae39\") " Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.392832 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.392854 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6324588c-728c-4540-ab29-2ca00b24ae39-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 
06:08:04.395710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6324588c-728c-4540-ab29-2ca00b24ae39-kube-api-access-xt8nt" (OuterVolumeSpecName: "kube-api-access-xt8nt") pod "6324588c-728c-4540-ab29-2ca00b24ae39" (UID: "6324588c-728c-4540-ab29-2ca00b24ae39"). InnerVolumeSpecName "kube-api-access-xt8nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.396539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-scripts" (OuterVolumeSpecName: "scripts") pod "6324588c-728c-4540-ab29-2ca00b24ae39" (UID: "6324588c-728c-4540-ab29-2ca00b24ae39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.397318 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6324588c-728c-4540-ab29-2ca00b24ae39" (UID: "6324588c-728c-4540-ab29-2ca00b24ae39"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.398275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6324588c-728c-4540-ab29-2ca00b24ae39" (UID: "6324588c-728c-4540-ab29-2ca00b24ae39"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.409955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-config-data" (OuterVolumeSpecName: "config-data") pod "6324588c-728c-4540-ab29-2ca00b24ae39" (UID: "6324588c-728c-4540-ab29-2ca00b24ae39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.494751 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.494817 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.494830 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.494843 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6324588c-728c-4540-ab29-2ca00b24ae39-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:04 crc kubenswrapper[4707]: I0218 06:08:04.494855 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt8nt\" (UniqueName: \"kubernetes.io/projected/6324588c-728c-4540-ab29-2ca00b24ae39-kube-api-access-xt8nt\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.226065 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.285103 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.308528 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.325157 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.345602 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.347925 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.354069 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.357685 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.520583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.520632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-config-data\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.520691 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-scripts\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.520708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.520763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-run-httpd\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.520810 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkhhs\" (UniqueName: \"kubernetes.io/projected/91fb305a-2ad7-4136-9378-38efe97fd482-kube-api-access-zkhhs\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.520840 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-log-httpd\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.622046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-scripts\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.622105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.622187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-run-httpd\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.622233 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkhhs\" (UniqueName: \"kubernetes.io/projected/91fb305a-2ad7-4136-9378-38efe97fd482-kube-api-access-zkhhs\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.622281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-log-httpd\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.622327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: 
I0218 06:08:05.622358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-config-data\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.623772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-log-httpd\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.624339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-run-httpd\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.626931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.627093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.633509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-scripts\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " 
pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.634219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-config-data\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.638931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkhhs\" (UniqueName: \"kubernetes.io/projected/91fb305a-2ad7-4136-9378-38efe97fd482-kube-api-access-zkhhs\") pod \"ceilometer-0\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") " pod="openstack/ceilometer-0" Feb 18 06:08:05 crc kubenswrapper[4707]: I0218 06:08:05.669044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:06 crc kubenswrapper[4707]: I0218 06:08:06.064964 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6324588c-728c-4540-ab29-2ca00b24ae39" path="/var/lib/kubelet/pods/6324588c-728c-4540-ab29-2ca00b24ae39/volumes" Feb 18 06:08:06 crc kubenswrapper[4707]: I0218 06:08:06.114948 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:06 crc kubenswrapper[4707]: I0218 06:08:06.236552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerStarted","Data":"239eb4c45e5fe8bc8435c0956c40a931784c1e96e1cbb670cbcaa770856f73c4"} Feb 18 06:08:06 crc kubenswrapper[4707]: I0218 06:08:06.666285 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:07 crc kubenswrapper[4707]: I0218 06:08:07.248849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerStarted","Data":"9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8"} Feb 18 06:08:08 crc kubenswrapper[4707]: I0218 06:08:08.274025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerStarted","Data":"58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7"} Feb 18 06:08:08 crc kubenswrapper[4707]: I0218 06:08:08.274490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerStarted","Data":"3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8"} Feb 18 06:08:08 crc kubenswrapper[4707]: I0218 06:08:08.977180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Feb 18 06:08:10 crc kubenswrapper[4707]: I0218 06:08:10.295636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerStarted","Data":"ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74"} Feb 18 06:08:10 crc kubenswrapper[4707]: I0218 06:08:10.296172 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:08:10 crc kubenswrapper[4707]: I0218 06:08:10.323846 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.917474913 podStartE2EDuration="5.323818389s" podCreationTimestamp="2026-02-18 06:08:05 +0000 UTC" firstStartedPulling="2026-02-18 06:08:06.109203951 +0000 UTC m=+1222.757163095" lastFinishedPulling="2026-02-18 06:08:09.515547437 +0000 UTC m=+1226.163506571" observedRunningTime="2026-02-18 06:08:10.323172012 +0000 UTC m=+1226.971131226" watchObservedRunningTime="2026-02-18 06:08:10.323818389 +0000 UTC m=+1226.971777533" Feb 
18 06:08:11 crc kubenswrapper[4707]: E0218 06:08:11.667774 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:11 crc kubenswrapper[4707]: E0218 06:08:11.669506 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:11 crc kubenswrapper[4707]: E0218 06:08:11.670498 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:11 crc kubenswrapper[4707]: E0218 06:08:11.670535 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerName="nova-cell0-conductor-conductor" Feb 18 06:08:16 crc kubenswrapper[4707]: E0218 06:08:16.669401 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:16 crc kubenswrapper[4707]: 
E0218 06:08:16.671400 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:16 crc kubenswrapper[4707]: E0218 06:08:16.673283 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:16 crc kubenswrapper[4707]: E0218 06:08:16.673466 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerName="nova-cell0-conductor-conductor" Feb 18 06:08:21 crc kubenswrapper[4707]: I0218 06:08:21.382724 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:08:21 crc kubenswrapper[4707]: I0218 06:08:21.383093 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:08:21 crc kubenswrapper[4707]: E0218 06:08:21.678531 4707 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:21 crc kubenswrapper[4707]: E0218 06:08:21.681563 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:21 crc kubenswrapper[4707]: E0218 06:08:21.683983 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:21 crc kubenswrapper[4707]: E0218 06:08:21.684064 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerName="nova-cell0-conductor-conductor" Feb 18 06:08:26 crc kubenswrapper[4707]: E0218 06:08:26.667848 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:26 crc kubenswrapper[4707]: E0218 06:08:26.669947 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:26 crc kubenswrapper[4707]: E0218 06:08:26.671269 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:26 crc kubenswrapper[4707]: E0218 06:08:26.671364 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerName="nova-cell0-conductor-conductor" Feb 18 06:08:31 crc kubenswrapper[4707]: E0218 06:08:31.669166 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:31 crc kubenswrapper[4707]: E0218 06:08:31.672239 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:31 crc kubenswrapper[4707]: E0218 06:08:31.673992 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register 
an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 18 06:08:31 crc kubenswrapper[4707]: E0218 06:08:31.674063 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerName="nova-cell0-conductor-conductor" Feb 18 06:08:33 crc kubenswrapper[4707]: E0218 06:08:33.335076 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3fe6993_5a3f_486e_90e7_1c54a4e846ec.slice/crio-2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15.scope\": RecentStats: unable to find data in memory cache]" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.598494 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.722583 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" exitCode=137 Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.722604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-combined-ca-bundle\") pod \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.722669 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.722683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c3fe6993-5a3f-486e-90e7-1c54a4e846ec","Type":"ContainerDied","Data":"2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15"} Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.722736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c3fe6993-5a3f-486e-90e7-1c54a4e846ec","Type":"ContainerDied","Data":"54c626d8becd8e5c2f5edb84d4347ddc85b7f2a544c7b446a53ccbe9f6458b44"} Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.722758 4707 scope.go:117] "RemoveContainer" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.723081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqs7h\" (UniqueName: \"kubernetes.io/projected/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-kube-api-access-tqs7h\") pod \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.723212 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-config-data\") pod \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\" (UID: \"c3fe6993-5a3f-486e-90e7-1c54a4e846ec\") " Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.745033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-kube-api-access-tqs7h" (OuterVolumeSpecName: "kube-api-access-tqs7h") pod "c3fe6993-5a3f-486e-90e7-1c54a4e846ec" (UID: "c3fe6993-5a3f-486e-90e7-1c54a4e846ec"). InnerVolumeSpecName "kube-api-access-tqs7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.757943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-config-data" (OuterVolumeSpecName: "config-data") pod "c3fe6993-5a3f-486e-90e7-1c54a4e846ec" (UID: "c3fe6993-5a3f-486e-90e7-1c54a4e846ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.773016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3fe6993-5a3f-486e-90e7-1c54a4e846ec" (UID: "c3fe6993-5a3f-486e-90e7-1c54a4e846ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.825643 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.825671 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.825683 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqs7h\" (UniqueName: \"kubernetes.io/projected/c3fe6993-5a3f-486e-90e7-1c54a4e846ec-kube-api-access-tqs7h\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.871008 4707 scope.go:117] "RemoveContainer" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" Feb 18 06:08:33 crc kubenswrapper[4707]: E0218 06:08:33.871436 4707 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15\": container with ID starting with 2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15 not found: ID does not exist" containerID="2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15" Feb 18 06:08:33 crc kubenswrapper[4707]: I0218 06:08:33.871465 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15"} err="failed to get container status \"2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15\": rpc error: code = NotFound desc = could not find container \"2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15\": container with ID starting with 2c5254915092e4aa5b3bd38032257bcb1f242cd18b8f259deb6dc5950ceabb15 not found: ID does not exist" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.064329 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.070135 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.098641 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 18 06:08:34 crc kubenswrapper[4707]: E0218 06:08:34.099148 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerName="nova-cell0-conductor-conductor" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.099170 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerName="nova-cell0-conductor-conductor" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.099373 4707 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" containerName="nova-cell0-conductor-conductor" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.100135 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.103117 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-l9hgj" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.103329 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.108228 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.232121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lr46\" (UniqueName: \"kubernetes.io/projected/e596f7ea-65d0-41e8-8469-bf3aace5ed9a-kube-api-access-2lr46\") pod \"nova-cell0-conductor-0\" (UID: \"e596f7ea-65d0-41e8-8469-bf3aace5ed9a\") " pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.232544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e596f7ea-65d0-41e8-8469-bf3aace5ed9a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e596f7ea-65d0-41e8-8469-bf3aace5ed9a\") " pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.232597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e596f7ea-65d0-41e8-8469-bf3aace5ed9a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e596f7ea-65d0-41e8-8469-bf3aace5ed9a\") " pod="openstack/nova-cell0-conductor-0" Feb 18 
06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.334129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lr46\" (UniqueName: \"kubernetes.io/projected/e596f7ea-65d0-41e8-8469-bf3aace5ed9a-kube-api-access-2lr46\") pod \"nova-cell0-conductor-0\" (UID: \"e596f7ea-65d0-41e8-8469-bf3aace5ed9a\") " pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.334233 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e596f7ea-65d0-41e8-8469-bf3aace5ed9a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e596f7ea-65d0-41e8-8469-bf3aace5ed9a\") " pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.334278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e596f7ea-65d0-41e8-8469-bf3aace5ed9a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e596f7ea-65d0-41e8-8469-bf3aace5ed9a\") " pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.338440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e596f7ea-65d0-41e8-8469-bf3aace5ed9a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e596f7ea-65d0-41e8-8469-bf3aace5ed9a\") " pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.343257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e596f7ea-65d0-41e8-8469-bf3aace5ed9a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e596f7ea-65d0-41e8-8469-bf3aace5ed9a\") " pod="openstack/nova-cell0-conductor-0" Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.354389 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-2lr46\" (UniqueName: \"kubernetes.io/projected/e596f7ea-65d0-41e8-8469-bf3aace5ed9a-kube-api-access-2lr46\") pod \"nova-cell0-conductor-0\" (UID: \"e596f7ea-65d0-41e8-8469-bf3aace5ed9a\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.418786 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:34 crc kubenswrapper[4707]: I0218 06:08:34.995005 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 06:08:34 crc kubenswrapper[4707]: W0218 06:08:34.996978 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode596f7ea_65d0_41e8_8469_bf3aace5ed9a.slice/crio-16cedb5d9abccf182543364849894cf5cd9f9245752c777b2d0e84bcfc27e469 WatchSource:0}: Error finding container 16cedb5d9abccf182543364849894cf5cd9f9245752c777b2d0e84bcfc27e469: Status 404 returned error can't find the container with id 16cedb5d9abccf182543364849894cf5cd9f9245752c777b2d0e84bcfc27e469
Feb 18 06:08:35 crc kubenswrapper[4707]: I0218 06:08:35.676954 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 18 06:08:35 crc kubenswrapper[4707]: I0218 06:08:35.749777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e596f7ea-65d0-41e8-8469-bf3aace5ed9a","Type":"ContainerStarted","Data":"ee3c94348198daf3a6f080e76c4d0bc626050b584d5d1bf03b2b83ef42e04bc9"}
Feb 18 06:08:35 crc kubenswrapper[4707]: I0218 06:08:35.749887 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e596f7ea-65d0-41e8-8469-bf3aace5ed9a","Type":"ContainerStarted","Data":"16cedb5d9abccf182543364849894cf5cd9f9245752c777b2d0e84bcfc27e469"}
Feb 18 06:08:35 crc kubenswrapper[4707]: I0218 06:08:35.750562 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:36 crc kubenswrapper[4707]: I0218 06:08:36.067321 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3fe6993-5a3f-486e-90e7-1c54a4e846ec" path="/var/lib/kubelet/pods/c3fe6993-5a3f-486e-90e7-1c54a4e846ec/volumes"
Feb 18 06:08:39 crc kubenswrapper[4707]: I0218 06:08:39.408131 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=5.408096959 podStartE2EDuration="5.408096959s" podCreationTimestamp="2026-02-18 06:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:35.776113347 +0000 UTC m=+1252.424072491" watchObservedRunningTime="2026-02-18 06:08:39.408096959 +0000 UTC m=+1256.056056093"
Feb 18 06:08:39 crc kubenswrapper[4707]: I0218 06:08:39.415046 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:39 crc kubenswrapper[4707]: I0218 06:08:39.415267 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b" containerName="kube-state-metrics" containerID="cri-o://df83f4762c58d675d988731d1991ae0c95a0096a3656759e571959a4314351b3" gracePeriod=30
Feb 18 06:08:39 crc kubenswrapper[4707]: I0218 06:08:39.791366 4707 generic.go:334] "Generic (PLEG): container finished" podID="5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b" containerID="df83f4762c58d675d988731d1991ae0c95a0096a3656759e571959a4314351b3" exitCode=2
Feb 18 06:08:39 crc kubenswrapper[4707]: I0218 06:08:39.791494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b","Type":"ContainerDied","Data":"df83f4762c58d675d988731d1991ae0c95a0096a3656759e571959a4314351b3"}
Feb 18 06:08:39 crc kubenswrapper[4707]: I0218 06:08:39.999169 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.148463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r24j6\" (UniqueName: \"kubernetes.io/projected/5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b-kube-api-access-r24j6\") pod \"5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b\" (UID: \"5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b\") "
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.158747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b-kube-api-access-r24j6" (OuterVolumeSpecName: "kube-api-access-r24j6") pod "5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b" (UID: "5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b"). InnerVolumeSpecName "kube-api-access-r24j6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.251344 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r24j6\" (UniqueName: \"kubernetes.io/projected/5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b-kube-api-access-r24j6\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.801434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b","Type":"ContainerDied","Data":"e07c5e7a395476362faae959d8c903459c219e7c9cf988eb65a6d5df12447329"}
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.801491 4707 scope.go:117] "RemoveContainer" containerID="df83f4762c58d675d988731d1991ae0c95a0096a3656759e571959a4314351b3"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.801507 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.844412 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.853475 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.865913 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:40 crc kubenswrapper[4707]: E0218 06:08:40.866413 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b" containerName="kube-state-metrics"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.866430 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b" containerName="kube-state-metrics"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.866610 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b" containerName="kube-state-metrics"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.868719 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.872170 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.872224 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.878135 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.964629 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.964812 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.964846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:40 crc kubenswrapper[4707]: I0218 06:08:40.964894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbt4z\" (UniqueName: \"kubernetes.io/projected/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-kube-api-access-xbt4z\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.067154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbt4z\" (UniqueName: \"kubernetes.io/projected/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-kube-api-access-xbt4z\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.067241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.067377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.067439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.072234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.072485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.073648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.088572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbt4z\" (UniqueName: \"kubernetes.io/projected/e2c446d7-5c5f-40e6-831d-4c3e6c75d13d-kube-api-access-xbt4z\") pod \"kube-state-metrics-0\" (UID: \"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d\") " pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.226622 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.299767 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.300084 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="ceilometer-central-agent" containerID="cri-o://9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8" gracePeriod=30
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.300213 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="proxy-httpd" containerID="cri-o://ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74" gracePeriod=30
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.300248 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="sg-core" containerID="cri-o://58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7" gracePeriod=30
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.300277 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="ceilometer-notification-agent" containerID="cri-o://3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8" gracePeriod=30
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.798533 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.828775 4707 generic.go:334] "Generic (PLEG): container finished" podID="91fb305a-2ad7-4136-9378-38efe97fd482" containerID="ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74" exitCode=0
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.829749 4707 generic.go:334] "Generic (PLEG): container finished" podID="91fb305a-2ad7-4136-9378-38efe97fd482" containerID="58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7" exitCode=2
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.829766 4707 generic.go:334] "Generic (PLEG): container finished" podID="91fb305a-2ad7-4136-9378-38efe97fd482" containerID="9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8" exitCode=0
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.829818 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerDied","Data":"ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74"}
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.829842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerDied","Data":"58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7"}
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.829851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerDied","Data":"9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8"}
Feb 18 06:08:41 crc kubenswrapper[4707]: I0218 06:08:41.839353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d","Type":"ContainerStarted","Data":"8c8e01c86957d656a71e96ead0f3dc15d5b0dd5ca7ba9adaff1440e236078d5f"}
Feb 18 06:08:42 crc kubenswrapper[4707]: I0218 06:08:42.065895 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b" path="/var/lib/kubelet/pods/5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b/volumes"
Feb 18 06:08:42 crc kubenswrapper[4707]: I0218 06:08:42.857641 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2c446d7-5c5f-40e6-831d-4c3e6c75d13d","Type":"ContainerStarted","Data":"5ee33cdaf1564583c236a3d059246da8ae2ebda20f968a99cdf108902cc1374f"}
Feb 18 06:08:42 crc kubenswrapper[4707]: I0218 06:08:42.858432 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 18 06:08:42 crc kubenswrapper[4707]: I0218 06:08:42.888761 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.5048276769999998 podStartE2EDuration="2.888730677s" podCreationTimestamp="2026-02-18 06:08:40 +0000 UTC" firstStartedPulling="2026-02-18 06:08:41.808971829 +0000 UTC m=+1258.456930963" lastFinishedPulling="2026-02-18 06:08:42.192874829 +0000 UTC m=+1258.840833963" observedRunningTime="2026-02-18 06:08:42.876392696 +0000 UTC m=+1259.524351860" watchObservedRunningTime="2026-02-18 06:08:42.888730677 +0000 UTC m=+1259.536689841"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.832651 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.867757 4707 generic.go:334] "Generic (PLEG): container finished" podID="91fb305a-2ad7-4136-9378-38efe97fd482" containerID="3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8" exitCode=0
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.867834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerDied","Data":"3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8"}
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.867875 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.867905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91fb305a-2ad7-4136-9378-38efe97fd482","Type":"ContainerDied","Data":"239eb4c45e5fe8bc8435c0956c40a931784c1e96e1cbb670cbcaa770856f73c4"}
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.867949 4707 scope.go:117] "RemoveContainer" containerID="ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.898683 4707 scope.go:117] "RemoveContainer" containerID="58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.919078 4707 scope.go:117] "RemoveContainer" containerID="3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.941747 4707 scope.go:117] "RemoveContainer" containerID="9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.966069 4707 scope.go:117] "RemoveContainer" containerID="ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74"
Feb 18 06:08:43 crc kubenswrapper[4707]: E0218 06:08:43.966609 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74\": container with ID starting with ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74 not found: ID does not exist" containerID="ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.966660 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74"} err="failed to get container status \"ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74\": rpc error: code = NotFound desc = could not find container \"ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74\": container with ID starting with ec81a17403b2ff97163a68b67d8e39facd60b34dfd440a9a5cc4f34bc0575e74 not found: ID does not exist"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.966698 4707 scope.go:117] "RemoveContainer" containerID="58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7"
Feb 18 06:08:43 crc kubenswrapper[4707]: E0218 06:08:43.967135 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7\": container with ID starting with 58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7 not found: ID does not exist" containerID="58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.967164 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7"} err="failed to get container status \"58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7\": rpc error: code = NotFound desc = could not find container \"58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7\": container with ID starting with 58cc46f86dd857ee202dbf6586f22489506effed63bdb911af6befedf3c726a7 not found: ID does not exist"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.967181 4707 scope.go:117] "RemoveContainer" containerID="3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8"
Feb 18 06:08:43 crc kubenswrapper[4707]: E0218 06:08:43.967402 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8\": container with ID starting with 3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8 not found: ID does not exist" containerID="3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.967429 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8"} err="failed to get container status \"3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8\": rpc error: code = NotFound desc = could not find container \"3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8\": container with ID starting with 3c600017db859138791e5f76171783b9bbfda5094dcaef906483d535614881c8 not found: ID does not exist"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.967446 4707 scope.go:117] "RemoveContainer" containerID="9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8"
Feb 18 06:08:43 crc kubenswrapper[4707]: E0218 06:08:43.967664 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8\": container with ID starting with 9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8 not found: ID does not exist" containerID="9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.967688 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8"} err="failed to get container status \"9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8\": rpc error: code = NotFound desc = could not find container \"9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8\": container with ID starting with 9bd37f5e7e0229701579432ad2f4ad1152662acbddfcaf07b221fa42adf92ed8 not found: ID does not exist"
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.976553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkhhs\" (UniqueName: \"kubernetes.io/projected/91fb305a-2ad7-4136-9378-38efe97fd482-kube-api-access-zkhhs\") pod \"91fb305a-2ad7-4136-9378-38efe97fd482\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") "
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.976615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-combined-ca-bundle\") pod \"91fb305a-2ad7-4136-9378-38efe97fd482\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") "
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.976643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-scripts\") pod \"91fb305a-2ad7-4136-9378-38efe97fd482\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") "
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.976677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-config-data\") pod \"91fb305a-2ad7-4136-9378-38efe97fd482\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") "
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.976720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-log-httpd\") pod \"91fb305a-2ad7-4136-9378-38efe97fd482\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") "
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.976750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-run-httpd\") pod \"91fb305a-2ad7-4136-9378-38efe97fd482\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") "
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.976831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-sg-core-conf-yaml\") pod \"91fb305a-2ad7-4136-9378-38efe97fd482\" (UID: \"91fb305a-2ad7-4136-9378-38efe97fd482\") "
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.977787 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "91fb305a-2ad7-4136-9378-38efe97fd482" (UID: "91fb305a-2ad7-4136-9378-38efe97fd482"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.979100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "91fb305a-2ad7-4136-9378-38efe97fd482" (UID: "91fb305a-2ad7-4136-9378-38efe97fd482"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.985172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-scripts" (OuterVolumeSpecName: "scripts") pod "91fb305a-2ad7-4136-9378-38efe97fd482" (UID: "91fb305a-2ad7-4136-9378-38efe97fd482"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:43 crc kubenswrapper[4707]: I0218 06:08:43.985996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fb305a-2ad7-4136-9378-38efe97fd482-kube-api-access-zkhhs" (OuterVolumeSpecName: "kube-api-access-zkhhs") pod "91fb305a-2ad7-4136-9378-38efe97fd482" (UID: "91fb305a-2ad7-4136-9378-38efe97fd482"). InnerVolumeSpecName "kube-api-access-zkhhs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.015177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "91fb305a-2ad7-4136-9378-38efe97fd482" (UID: "91fb305a-2ad7-4136-9378-38efe97fd482"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.067308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91fb305a-2ad7-4136-9378-38efe97fd482" (UID: "91fb305a-2ad7-4136-9378-38efe97fd482"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.078414 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.078819 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkhhs\" (UniqueName: \"kubernetes.io/projected/91fb305a-2ad7-4136-9378-38efe97fd482-kube-api-access-zkhhs\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.078898 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.078954 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.079016 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.079069 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91fb305a-2ad7-4136-9378-38efe97fd482-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.124171 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-config-data" (OuterVolumeSpecName: "config-data") pod "91fb305a-2ad7-4136-9378-38efe97fd482" (UID: "91fb305a-2ad7-4136-9378-38efe97fd482"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.193231 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fb305a-2ad7-4136-9378-38efe97fd482-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.267068 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.280713 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.291826 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:44 crc kubenswrapper[4707]: E0218 06:08:44.292391 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="ceilometer-central-agent"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.292413 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="ceilometer-central-agent"
Feb 18 06:08:44 crc kubenswrapper[4707]: E0218 06:08:44.292426 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="sg-core"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.292432 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="sg-core"
Feb 18 06:08:44 crc kubenswrapper[4707]: E0218 06:08:44.292462 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="proxy-httpd"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.292471 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="proxy-httpd"
Feb 18 06:08:44 crc kubenswrapper[4707]: E0218 06:08:44.292492 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="ceilometer-notification-agent"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.292500 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="ceilometer-notification-agent"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.292790 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="ceilometer-central-agent"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.292831 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="sg-core"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.292884 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="proxy-httpd"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.292898 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" containerName="ceilometer-notification-agent"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.295349 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.298418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.303476 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.303658 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.303911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.402923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-scripts\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.402978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-run-httpd\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.403125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.403194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-log-httpd\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.404025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2frp\" (UniqueName: \"kubernetes.io/projected/f0884a8f-f290-4556-9808-47963ef4cd51-kube-api-access-q2frp\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.404384 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.404463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.404533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-config-data\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.447284 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.506418 4707 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.506812 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.506857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-config-data\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.506926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-scripts\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.506942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-run-httpd\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.506986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " 
pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.507013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-log-httpd\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.507179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2frp\" (UniqueName: \"kubernetes.io/projected/f0884a8f-f290-4556-9808-47963ef4cd51-kube-api-access-q2frp\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.507851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-log-httpd\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.511464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-run-httpd\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.514775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.516276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-scripts\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.516473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.517819 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.519894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-config-data\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.530987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2frp\" (UniqueName: \"kubernetes.io/projected/f0884a8f-f290-4556-9808-47963ef4cd51-kube-api-access-q2frp\") pod \"ceilometer-0\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.646936 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.746395 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="5599a4a6-2d3b-4bab-a1f8-bc87c27f7b5b" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.960435 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gbm4m"] Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.962456 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.965225 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.965405 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 18 06:08:44 crc kubenswrapper[4707]: I0218 06:08:44.969292 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gbm4m"] Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.022175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2w4z\" (UniqueName: \"kubernetes.io/projected/7ac58976-c8d4-406d-abf2-055b212106d1-kube-api-access-h2w4z\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.022241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-combined-ca-bundle\") 
pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.022289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-scripts\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.022346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-config-data\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.124073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2w4z\" (UniqueName: \"kubernetes.io/projected/7ac58976-c8d4-406d-abf2-055b212106d1-kube-api-access-h2w4z\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.124562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.124920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-scripts\") pod 
\"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.125209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-config-data\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.134070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.136182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-scripts\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.147917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-config-data\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.162155 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.170596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2w4z\" (UniqueName: 
\"kubernetes.io/projected/7ac58976-c8d4-406d-abf2-055b212106d1-kube-api-access-h2w4z\") pod \"nova-cell0-cell-mapping-gbm4m\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.183371 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.185131 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.203444 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.236323 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.292013 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.292629 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.293941 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.304626 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.331130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxhnz\" (UniqueName: \"kubernetes.io/projected/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-kube-api-access-cxhnz\") pod \"nova-cell1-novncproxy-0\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.331191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.331294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.372862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.432907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxhnz\" (UniqueName: \"kubernetes.io/projected/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-kube-api-access-cxhnz\") pod \"nova-cell1-novncproxy-0\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc 
kubenswrapper[4707]: I0218 06:08:45.432968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.433058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.433107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-config-data\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.433148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xrn\" (UniqueName: \"kubernetes.io/projected/0b84cf9d-dc6d-4210-acd8-f383978d61d5-kube-api-access-w2xrn\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.433179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b84cf9d-dc6d-4210-acd8-f383978d61d5-logs\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.433199 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.446639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.453511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.467581 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.477217 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.487307 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.541146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b84cf9d-dc6d-4210-acd8-f383978d61d5-logs\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.541199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.541363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-config-data\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.541411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xrn\" (UniqueName: \"kubernetes.io/projected/0b84cf9d-dc6d-4210-acd8-f383978d61d5-kube-api-access-w2xrn\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.542118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b84cf9d-dc6d-4210-acd8-f383978d61d5-logs\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: 
I0218 06:08:45.543029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxhnz\" (UniqueName: \"kubernetes.io/projected/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-kube-api-access-cxhnz\") pod \"nova-cell1-novncproxy-0\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.553641 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-config-data\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.555721 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.566735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.630400 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xrn\" (UniqueName: \"kubernetes.io/projected/0b84cf9d-dc6d-4210-acd8-f383978d61d5-kube-api-access-w2xrn\") pod \"nova-api-0\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.654312 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.655761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4nh7\" (UniqueName: \"kubernetes.io/projected/50f94bf4-efd2-47dc-86e5-68c81538c19f-kube-api-access-d4nh7\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.655865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.655898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f94bf4-efd2-47dc-86e5-68c81538c19f-logs\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.655924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-config-data\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.758997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 
06:08:45.759497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f94bf4-efd2-47dc-86e5-68c81538c19f-logs\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.759526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-config-data\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.759637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4nh7\" (UniqueName: \"kubernetes.io/projected/50f94bf4-efd2-47dc-86e5-68c81538c19f-kube-api-access-d4nh7\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.764322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f94bf4-efd2-47dc-86e5-68c81538c19f-logs\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.767556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-config-data\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0" Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.794512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.799584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4nh7\" (UniqueName: \"kubernetes.io/projected/50f94bf4-efd2-47dc-86e5-68c81538c19f-kube-api-access-d4nh7\") pod \"nova-metadata-0\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") " pod="openstack/nova-metadata-0"
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.821783 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.848357 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.849557 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.851241 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.856434 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.884706 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.921956 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b49846fdf-kstxv"]
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.936732 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.968966 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b49846fdf-kstxv"]
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.971519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.971567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvkk\" (UniqueName: \"kubernetes.io/projected/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-kube-api-access-hpvkk\") pod \"nova-scheduler-0\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:45 crc kubenswrapper[4707]: I0218 06:08:45.971655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-config-data\") pod \"nova-scheduler-0\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.003016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerStarted","Data":"f482efa0adf1dde8bf86af08aaa6b1d7d8e2728913b959b5ac98b0b53d3d1c67"}
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.073113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-nb\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.073171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.073209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvkk\" (UniqueName: \"kubernetes.io/projected/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-kube-api-access-hpvkk\") pod \"nova-scheduler-0\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.073249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-sb\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.073288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-svc\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.073321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-config-data\") pod \"nova-scheduler-0\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.073357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-swift-storage-0\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.073380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgksd\" (UniqueName: \"kubernetes.io/projected/a3871817-d356-4035-8d85-e42993ddad4f-kube-api-access-zgksd\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.073404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-config\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.080347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-config-data\") pod \"nova-scheduler-0\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.080954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.087989 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91fb305a-2ad7-4136-9378-38efe97fd482" path="/var/lib/kubelet/pods/91fb305a-2ad7-4136-9378-38efe97fd482/volumes"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.092683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvkk\" (UniqueName: \"kubernetes.io/projected/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-kube-api-access-hpvkk\") pod \"nova-scheduler-0\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " pod="openstack/nova-scheduler-0"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.174908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-sb\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.174958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-svc\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.175001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-swift-storage-0\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.175022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgksd\" (UniqueName: \"kubernetes.io/projected/a3871817-d356-4035-8d85-e42993ddad4f-kube-api-access-zgksd\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.175040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-config\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.175122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-nb\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.175898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-nb\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.176414 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-sb\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.177952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-swift-storage-0\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.178047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-svc\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.178167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-config\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.198154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgksd\" (UniqueName: \"kubernetes.io/projected/a3871817-d356-4035-8d85-e42993ddad4f-kube-api-access-zgksd\") pod \"dnsmasq-dns-7b49846fdf-kstxv\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.218780 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.296932 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gbm4m"]
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.325746 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.524677 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.597142 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nnl9x"]
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.598332 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.600382 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.608362 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.629302 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nnl9x"]
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.691456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrsf8\" (UniqueName: \"kubernetes.io/projected/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-kube-api-access-zrsf8\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.691614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-scripts\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.691657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-config-data\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.691697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.703750 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.755546 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.797215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-config-data\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.797315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.797365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrsf8\" (UniqueName: \"kubernetes.io/projected/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-kube-api-access-zrsf8\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.797497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-scripts\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.806126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-scripts\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.806739 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-config-data\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.807375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.824382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrsf8\" (UniqueName: \"kubernetes.io/projected/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-kube-api-access-zrsf8\") pod \"nova-cell1-conductor-db-sync-nnl9x\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.966567 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 06:08:46 crc kubenswrapper[4707]: I0218 06:08:46.971463 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nnl9x"
Feb 18 06:08:46 crc kubenswrapper[4707]: W0218 06:08:46.977250 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7ec90c2_23aa_44b3_9086_fcc3f6c9a809.slice/crio-be6aa0003a8cf64ed0fb24cb6af7bc1a701eb8ab80bb428c5c5328534ac6993e WatchSource:0}: Error finding container be6aa0003a8cf64ed0fb24cb6af7bc1a701eb8ab80bb428c5c5328534ac6993e: Status 404 returned error can't find the container with id be6aa0003a8cf64ed0fb24cb6af7bc1a701eb8ab80bb428c5c5328534ac6993e
Feb 18 06:08:47 crc kubenswrapper[4707]: W0218 06:08:47.004927 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3871817_d356_4035_8d85_e42993ddad4f.slice/crio-fca723528a1b402a9ace874216d737bfe00a68543be3ac381cf4c208a3d14262 WatchSource:0}: Error finding container fca723528a1b402a9ace874216d737bfe00a68543be3ac381cf4c208a3d14262: Status 404 returned error can't find the container with id fca723528a1b402a9ace874216d737bfe00a68543be3ac381cf4c208a3d14262
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.037242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerStarted","Data":"daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5"}
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.037675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerStarted","Data":"ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c"}
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.040905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809","Type":"ContainerStarted","Data":"be6aa0003a8cf64ed0fb24cb6af7bc1a701eb8ab80bb428c5c5328534ac6993e"}
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.047764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8acba92d-11e1-467a-aa8e-9cc9078c5dc2","Type":"ContainerStarted","Data":"279467e5694bc782d595e526c4b3fe39ef47712192ca6f60f6107e5d4f262267"}
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.050472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"50f94bf4-efd2-47dc-86e5-68c81538c19f","Type":"ContainerStarted","Data":"9a4e85a558800fdd25d02dd0be92cf4c98286197bced6b2686ce24aac65ae9aa"}
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.051439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0b84cf9d-dc6d-4210-acd8-f383978d61d5","Type":"ContainerStarted","Data":"27f9d59c2ec2fe742ca7b91f7e55e59ba8d246aaed31e6e5e51b6b4ef8cf66ed"}
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.054972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gbm4m" event={"ID":"7ac58976-c8d4-406d-abf2-055b212106d1","Type":"ContainerStarted","Data":"e4b127bfa70a9eae84101e91000f1fd1f05168390c49d6ed38a8bf7881f9e6c8"}
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.055016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gbm4m" event={"ID":"7ac58976-c8d4-406d-abf2-055b212106d1","Type":"ContainerStarted","Data":"3cd249988bac9eed5a103821bddd0b7fa35a9621c1af9503bcf307e4d766d083"}
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.082256 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b49846fdf-kstxv"]
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.095710 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gbm4m" podStartSLOduration=3.095684431 podStartE2EDuration="3.095684431s" podCreationTimestamp="2026-02-18 06:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:47.075122139 +0000 UTC m=+1263.723081273" watchObservedRunningTime="2026-02-18 06:08:47.095684431 +0000 UTC m=+1263.743643565"
Feb 18 06:08:47 crc kubenswrapper[4707]: I0218 06:08:47.526758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nnl9x"]
Feb 18 06:08:48 crc kubenswrapper[4707]: I0218 06:08:48.082641 4707 generic.go:334] "Generic (PLEG): container finished" podID="a3871817-d356-4035-8d85-e42993ddad4f" containerID="10d73319837d994e64db6b38221e2d65319271b8480a72f7da72676923a91e5e" exitCode=0
Feb 18 06:08:48 crc kubenswrapper[4707]: I0218 06:08:48.083172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" event={"ID":"a3871817-d356-4035-8d85-e42993ddad4f","Type":"ContainerDied","Data":"10d73319837d994e64db6b38221e2d65319271b8480a72f7da72676923a91e5e"}
Feb 18 06:08:48 crc kubenswrapper[4707]: I0218 06:08:48.083214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" event={"ID":"a3871817-d356-4035-8d85-e42993ddad4f","Type":"ContainerStarted","Data":"fca723528a1b402a9ace874216d737bfe00a68543be3ac381cf4c208a3d14262"}
Feb 18 06:08:48 crc kubenswrapper[4707]: I0218 06:08:48.087841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nnl9x" event={"ID":"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a","Type":"ContainerStarted","Data":"90dcd35fe507051c541b71b419ed8fd7f40936d24218c93090ff354907cea00f"}
Feb 18 06:08:48 crc kubenswrapper[4707]: I0218 06:08:48.087883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nnl9x" event={"ID":"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a","Type":"ContainerStarted","Data":"01067c2b9a52a0567e711fee8b630935639c0f443cd70fb7303b8f184cc5ec64"}
Feb 18 06:08:48 crc kubenswrapper[4707]: I0218 06:08:48.117365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerStarted","Data":"75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e"}
Feb 18 06:08:48 crc kubenswrapper[4707]: I0218 06:08:48.131226 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-nnl9x" podStartSLOduration=2.131214102 podStartE2EDuration="2.131214102s" podCreationTimestamp="2026-02-18 06:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:48.127337438 +0000 UTC m=+1264.775296572" watchObservedRunningTime="2026-02-18 06:08:48.131214102 +0000 UTC m=+1264.779173236"
Feb 18 06:08:49 crc kubenswrapper[4707]: I0218 06:08:49.135173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" event={"ID":"a3871817-d356-4035-8d85-e42993ddad4f","Type":"ContainerStarted","Data":"25c61f62abd4d903048f3081d70011f274201fe023c202212ab7665591c54261"}
Feb 18 06:08:49 crc kubenswrapper[4707]: I0218 06:08:49.135576 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv"
Feb 18 06:08:49 crc kubenswrapper[4707]: I0218 06:08:49.167778 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" podStartSLOduration=4.16775599 podStartE2EDuration="4.16775599s" podCreationTimestamp="2026-02-18 06:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:49.157576757 +0000 UTC m=+1265.805535881" watchObservedRunningTime="2026-02-18 06:08:49.16775599 +0000 UTC m=+1265.815715124"
Feb 18 06:08:49 crc kubenswrapper[4707]: I0218 06:08:49.279269 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 06:08:49 crc kubenswrapper[4707]: I0218 06:08:49.322166 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 06:08:51 crc kubenswrapper[4707]: I0218 06:08:51.243825 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 18 06:08:51 crc kubenswrapper[4707]: I0218 06:08:51.382934 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:08:51 crc kubenswrapper[4707]: I0218 06:08:51.383038 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.187969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0b84cf9d-dc6d-4210-acd8-f383978d61d5","Type":"ContainerStarted","Data":"dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4"}
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.188343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0b84cf9d-dc6d-4210-acd8-f383978d61d5","Type":"ContainerStarted","Data":"e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51"}
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.195822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerStarted","Data":"0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a"}
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.195971 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.197248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809","Type":"ContainerStarted","Data":"4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505"}
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.198749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"50f94bf4-efd2-47dc-86e5-68c81538c19f","Type":"ContainerStarted","Data":"c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af"}
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.198803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"50f94bf4-efd2-47dc-86e5-68c81538c19f","Type":"ContainerStarted","Data":"14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029"}
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.198830 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="50f94bf4-efd2-47dc-86e5-68c81538c19f" containerName="nova-metadata-log" containerID="cri-o://14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029" gracePeriod=30
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.198863 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="50f94bf4-efd2-47dc-86e5-68c81538c19f" containerName="nova-metadata-metadata" containerID="cri-o://c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af" gracePeriod=30
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.201076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8acba92d-11e1-467a-aa8e-9cc9078c5dc2","Type":"ContainerStarted","Data":"e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a"}
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.201176 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8acba92d-11e1-467a-aa8e-9cc9078c5dc2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a" gracePeriod=30
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.208379 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.72005847 podStartE2EDuration="7.208362411s" podCreationTimestamp="2026-02-18 06:08:45 +0000 UTC" firstStartedPulling="2026-02-18 06:08:46.621078595 +0000 UTC m=+1263.269037729" lastFinishedPulling="2026-02-18 06:08:51.109382536 +0000 UTC m=+1267.757341670" observedRunningTime="2026-02-18 06:08:52.204211059 +0000 UTC m=+1268.852170193" watchObservedRunningTime="2026-02-18 06:08:52.208362411 +0000 UTC m=+1268.856321545"
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.232852 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.296396174 podStartE2EDuration="8.232833037s" podCreationTimestamp="2026-02-18 06:08:44 +0000 UTC" firstStartedPulling="2026-02-18 06:08:45.182233592 +0000 UTC m=+1261.830192726" lastFinishedPulling="2026-02-18 06:08:51.118670455 +0000 UTC m=+1267.766629589" observedRunningTime="2026-02-18 06:08:52.231325647 +0000 UTC m=+1268.879284781" watchObservedRunningTime="2026-02-18 06:08:52.232833037 +0000 UTC m=+1268.880792171"
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.251177 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8992243220000002 podStartE2EDuration="7.251148669s" podCreationTimestamp="2026-02-18 06:08:45 +0000 UTC" firstStartedPulling="2026-02-18 06:08:46.764006644 +0000 UTC m=+1263.411965778" lastFinishedPulling="2026-02-18 06:08:51.115930991 +0000 UTC m=+1267.763890125" observedRunningTime="2026-02-18 06:08:52.248470467 +0000 UTC m=+1268.896429601" watchObservedRunningTime="2026-02-18 06:08:52.251148669 +0000 UTC m=+1268.899107803"
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.284737 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.965004468 podStartE2EDuration="7.284712461s" podCreationTimestamp="2026-02-18 06:08:45 +0000 UTC" firstStartedPulling="2026-02-18 06:08:46.783933309 +0000 UTC m=+1263.431892433" lastFinishedPulling="2026-02-18 06:08:51.103641292 +0000 UTC m=+1267.751600426" observedRunningTime="2026-02-18 06:08:52.264154158 +0000 UTC m=+1268.912113292" watchObservedRunningTime="2026-02-18 06:08:52.284712461 +0000 UTC m=+1268.932671595"
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.295502 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.173907709 podStartE2EDuration="7.29547474s" podCreationTimestamp="2026-02-18 06:08:45 +0000 UTC" firstStartedPulling="2026-02-18 06:08:46.987784844 +0000 UTC m=+1263.635743968" lastFinishedPulling="2026-02-18 06:08:51.109351865 +0000 UTC m=+1267.757310999" observedRunningTime="2026-02-18 06:08:52.286399116 +0000 UTC m=+1268.934358250" watchObservedRunningTime="2026-02-18 06:08:52.29547474 +0000 UTC m=+1268.943433874"
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.877635 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.946025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4nh7\" (UniqueName: \"kubernetes.io/projected/50f94bf4-efd2-47dc-86e5-68c81538c19f-kube-api-access-d4nh7\") pod \"50f94bf4-efd2-47dc-86e5-68c81538c19f\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") "
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.946181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-combined-ca-bundle\") pod \"50f94bf4-efd2-47dc-86e5-68c81538c19f\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") "
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.946305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f94bf4-efd2-47dc-86e5-68c81538c19f-logs\") pod \"50f94bf4-efd2-47dc-86e5-68c81538c19f\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") "
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.946350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-config-data\") pod \"50f94bf4-efd2-47dc-86e5-68c81538c19f\" (UID: \"50f94bf4-efd2-47dc-86e5-68c81538c19f\") "
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.946809 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f94bf4-efd2-47dc-86e5-68c81538c19f-logs" (OuterVolumeSpecName: "logs") pod "50f94bf4-efd2-47dc-86e5-68c81538c19f" (UID: "50f94bf4-efd2-47dc-86e5-68c81538c19f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.957100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f94bf4-efd2-47dc-86e5-68c81538c19f-kube-api-access-d4nh7" (OuterVolumeSpecName: "kube-api-access-d4nh7") pod "50f94bf4-efd2-47dc-86e5-68c81538c19f" (UID: "50f94bf4-efd2-47dc-86e5-68c81538c19f"). InnerVolumeSpecName "kube-api-access-d4nh7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.978738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50f94bf4-efd2-47dc-86e5-68c81538c19f" (UID: "50f94bf4-efd2-47dc-86e5-68c81538c19f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:52 crc kubenswrapper[4707]: I0218 06:08:52.983471 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-config-data" (OuterVolumeSpecName: "config-data") pod "50f94bf4-efd2-47dc-86e5-68c81538c19f" (UID: "50f94bf4-efd2-47dc-86e5-68c81538c19f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.048862 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.049094 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f94bf4-efd2-47dc-86e5-68c81538c19f-logs\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.049189 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f94bf4-efd2-47dc-86e5-68c81538c19f-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.049255 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4nh7\" (UniqueName: \"kubernetes.io/projected/50f94bf4-efd2-47dc-86e5-68c81538c19f-kube-api-access-d4nh7\") on node \"crc\" DevicePath \"\""
Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.219208 4707 generic.go:334] "Generic (PLEG): container finished" podID="50f94bf4-efd2-47dc-86e5-68c81538c19f" containerID="c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af" exitCode=0
Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.219259 4707 generic.go:334] "Generic (PLEG): container finished" podID="50f94bf4-efd2-47dc-86e5-68c81538c19f" containerID="14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029" exitCode=143
Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.219517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"50f94bf4-efd2-47dc-86e5-68c81538c19f","Type":"ContainerDied","Data":"c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af"}
Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.219567 4707
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"50f94bf4-efd2-47dc-86e5-68c81538c19f","Type":"ContainerDied","Data":"14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029"} Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.219582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"50f94bf4-efd2-47dc-86e5-68c81538c19f","Type":"ContainerDied","Data":"9a4e85a558800fdd25d02dd0be92cf4c98286197bced6b2686ce24aac65ae9aa"} Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.219606 4707 scope.go:117] "RemoveContainer" containerID="c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.220142 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.253237 4707 scope.go:117] "RemoveContainer" containerID="14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.275862 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.293431 4707 scope.go:117] "RemoveContainer" containerID="c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af" Feb 18 06:08:53 crc kubenswrapper[4707]: E0218 06:08:53.294780 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af\": container with ID starting with c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af not found: ID does not exist" containerID="c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.294912 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af"} err="failed to get container status \"c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af\": rpc error: code = NotFound desc = could not find container \"c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af\": container with ID starting with c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af not found: ID does not exist" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.294942 4707 scope.go:117] "RemoveContainer" containerID="14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.295035 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:53 crc kubenswrapper[4707]: E0218 06:08:53.295950 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029\": container with ID starting with 14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029 not found: ID does not exist" containerID="14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.296063 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029"} err="failed to get container status \"14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029\": rpc error: code = NotFound desc = could not find container \"14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029\": container with ID starting with 14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029 not found: ID does not exist" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.296139 4707 scope.go:117] "RemoveContainer" 
containerID="c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.296490 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af"} err="failed to get container status \"c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af\": rpc error: code = NotFound desc = could not find container \"c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af\": container with ID starting with c5210d97df4b15c6c2cbd863c7fd773e65ddc6e253908ed60f48ce85147ba8af not found: ID does not exist" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.296513 4707 scope.go:117] "RemoveContainer" containerID="14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.296761 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029"} err="failed to get container status \"14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029\": rpc error: code = NotFound desc = could not find container \"14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029\": container with ID starting with 14656289d676c24a91445303d67361181ef5adbdefbbdf037718dbce8eb80029 not found: ID does not exist" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.311962 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:53 crc kubenswrapper[4707]: E0218 06:08:53.312750 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f94bf4-efd2-47dc-86e5-68c81538c19f" containerName="nova-metadata-metadata" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.312778 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f94bf4-efd2-47dc-86e5-68c81538c19f" 
containerName="nova-metadata-metadata" Feb 18 06:08:53 crc kubenswrapper[4707]: E0218 06:08:53.312825 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f94bf4-efd2-47dc-86e5-68c81538c19f" containerName="nova-metadata-log" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.312834 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f94bf4-efd2-47dc-86e5-68c81538c19f" containerName="nova-metadata-log" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.313149 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f94bf4-efd2-47dc-86e5-68c81538c19f" containerName="nova-metadata-metadata" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.313184 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f94bf4-efd2-47dc-86e5-68c81538c19f" containerName="nova-metadata-log" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.314854 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.321550 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.321755 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.321958 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.467086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.467247 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.467383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fddf93f-f494-44f8-aed3-231f9e1b84a8-logs\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.467631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfkpb\" (UniqueName: \"kubernetes.io/projected/0fddf93f-f494-44f8-aed3-231f9e1b84a8-kube-api-access-hfkpb\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.467749 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-config-data\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.570327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.570656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.570761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fddf93f-f494-44f8-aed3-231f9e1b84a8-logs\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.570920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkpb\" (UniqueName: \"kubernetes.io/projected/0fddf93f-f494-44f8-aed3-231f9e1b84a8-kube-api-access-hfkpb\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.571824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-config-data\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.572953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fddf93f-f494-44f8-aed3-231f9e1b84a8-logs\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.577773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc 
kubenswrapper[4707]: I0218 06:08:53.579439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.585113 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-config-data\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.596377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfkpb\" (UniqueName: \"kubernetes.io/projected/0fddf93f-f494-44f8-aed3-231f9e1b84a8-kube-api-access-hfkpb\") pod \"nova-metadata-0\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " pod="openstack/nova-metadata-0" Feb 18 06:08:53 crc kubenswrapper[4707]: I0218 06:08:53.644285 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:08:54 crc kubenswrapper[4707]: I0218 06:08:54.066562 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f94bf4-efd2-47dc-86e5-68c81538c19f" path="/var/lib/kubelet/pods/50f94bf4-efd2-47dc-86e5-68c81538c19f/volumes" Feb 18 06:08:54 crc kubenswrapper[4707]: I0218 06:08:54.183077 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:54 crc kubenswrapper[4707]: I0218 06:08:54.234281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fddf93f-f494-44f8-aed3-231f9e1b84a8","Type":"ContainerStarted","Data":"13614d955c155e7520ffb715532626484e9914abf6892cb8586c6194429c9669"} Feb 18 06:08:55 crc kubenswrapper[4707]: I0218 06:08:55.247517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fddf93f-f494-44f8-aed3-231f9e1b84a8","Type":"ContainerStarted","Data":"2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6"} Feb 18 06:08:55 crc kubenswrapper[4707]: I0218 06:08:55.248236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fddf93f-f494-44f8-aed3-231f9e1b84a8","Type":"ContainerStarted","Data":"ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101"} Feb 18 06:08:55 crc kubenswrapper[4707]: I0218 06:08:55.655138 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:08:55 crc kubenswrapper[4707]: I0218 06:08:55.655204 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:08:55 crc kubenswrapper[4707]: I0218 06:08:55.822904 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.220036 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.220496 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.261380 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ac58976-c8d4-406d-abf2-055b212106d1" containerID="e4b127bfa70a9eae84101e91000f1fd1f05168390c49d6ed38a8bf7881f9e6c8" exitCode=0 Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.261458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gbm4m" event={"ID":"7ac58976-c8d4-406d-abf2-055b212106d1","Type":"ContainerDied","Data":"e4b127bfa70a9eae84101e91000f1fd1f05168390c49d6ed38a8bf7881f9e6c8"} Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.262741 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.263703 4707 generic.go:334] "Generic (PLEG): container finished" podID="693f7b53-7a5c-42a3-b5e7-9fc96d46b78a" containerID="90dcd35fe507051c541b71b419ed8fd7f40936d24218c93090ff354907cea00f" exitCode=0 Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.264971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nnl9x" event={"ID":"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a","Type":"ContainerDied","Data":"90dcd35fe507051c541b71b419ed8fd7f40936d24218c93090ff354907cea00f"} Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.287096 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.287075472 podStartE2EDuration="3.287075472s" podCreationTimestamp="2026-02-18 06:08:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:08:55.272898694 +0000 UTC m=+1271.920857848" 
watchObservedRunningTime="2026-02-18 06:08:56.287075472 +0000 UTC m=+1272.935034626" Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.327920 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.328010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.426822 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dcc9fdf5-4rnck"] Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.427167 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" podUID="3621d2e8-f3b4-41d6-9386-e151527d23d3" containerName="dnsmasq-dns" containerID="cri-o://767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b" gracePeriod=10 Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.739035 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:08:56 crc kubenswrapper[4707]: I0218 06:08:56.739042 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.021939 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.166773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-sb\") pod \"3621d2e8-f3b4-41d6-9386-e151527d23d3\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.166857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-nb\") pod \"3621d2e8-f3b4-41d6-9386-e151527d23d3\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.167025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-swift-storage-0\") pod \"3621d2e8-f3b4-41d6-9386-e151527d23d3\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.167115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-svc\") pod \"3621d2e8-f3b4-41d6-9386-e151527d23d3\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.167134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-config\") pod \"3621d2e8-f3b4-41d6-9386-e151527d23d3\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.167704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4l6f\" 
(UniqueName: \"kubernetes.io/projected/3621d2e8-f3b4-41d6-9386-e151527d23d3-kube-api-access-f4l6f\") pod \"3621d2e8-f3b4-41d6-9386-e151527d23d3\" (UID: \"3621d2e8-f3b4-41d6-9386-e151527d23d3\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.176690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3621d2e8-f3b4-41d6-9386-e151527d23d3-kube-api-access-f4l6f" (OuterVolumeSpecName: "kube-api-access-f4l6f") pod "3621d2e8-f3b4-41d6-9386-e151527d23d3" (UID: "3621d2e8-f3b4-41d6-9386-e151527d23d3"). InnerVolumeSpecName "kube-api-access-f4l6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.251577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3621d2e8-f3b4-41d6-9386-e151527d23d3" (UID: "3621d2e8-f3b4-41d6-9386-e151527d23d3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.255863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3621d2e8-f3b4-41d6-9386-e151527d23d3" (UID: "3621d2e8-f3b4-41d6-9386-e151527d23d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.256926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3621d2e8-f3b4-41d6-9386-e151527d23d3" (UID: "3621d2e8-f3b4-41d6-9386-e151527d23d3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.259331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-config" (OuterVolumeSpecName: "config") pod "3621d2e8-f3b4-41d6-9386-e151527d23d3" (UID: "3621d2e8-f3b4-41d6-9386-e151527d23d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.269006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3621d2e8-f3b4-41d6-9386-e151527d23d3" (UID: "3621d2e8-f3b4-41d6-9386-e151527d23d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.276918 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.276952 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.276966 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.276978 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:57 crc 
kubenswrapper[4707]: I0218 06:08:57.276986 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3621d2e8-f3b4-41d6-9386-e151527d23d3-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.276995 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4l6f\" (UniqueName: \"kubernetes.io/projected/3621d2e8-f3b4-41d6-9386-e151527d23d3-kube-api-access-f4l6f\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.281241 4707 generic.go:334] "Generic (PLEG): container finished" podID="3621d2e8-f3b4-41d6-9386-e151527d23d3" containerID="767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b" exitCode=0 Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.281913 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.282172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" event={"ID":"3621d2e8-f3b4-41d6-9386-e151527d23d3","Type":"ContainerDied","Data":"767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b"} Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.282211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcc9fdf5-4rnck" event={"ID":"3621d2e8-f3b4-41d6-9386-e151527d23d3","Type":"ContainerDied","Data":"09a47cbf51f4c266240e4a1a4a43b92404f7d1217648dc6c21e28a3241a4d599"} Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.282228 4707 scope.go:117] "RemoveContainer" containerID="767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.326773 4707 scope.go:117] "RemoveContainer" containerID="5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 
06:08:57.326922 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dcc9fdf5-4rnck"] Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.338045 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9dcc9fdf5-4rnck"] Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.361701 4707 scope.go:117] "RemoveContainer" containerID="767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b" Feb 18 06:08:57 crc kubenswrapper[4707]: E0218 06:08:57.364214 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b\": container with ID starting with 767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b not found: ID does not exist" containerID="767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.364249 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b"} err="failed to get container status \"767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b\": rpc error: code = NotFound desc = could not find container \"767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b\": container with ID starting with 767c5a6c6052e95323023eb9c7652d5a68aede4b99d471b9a65e379e8789c70b not found: ID does not exist" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.364291 4707 scope.go:117] "RemoveContainer" containerID="5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986" Feb 18 06:08:57 crc kubenswrapper[4707]: E0218 06:08:57.368197 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986\": container with ID starting with 
5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986 not found: ID does not exist" containerID="5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.368254 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986"} err="failed to get container status \"5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986\": rpc error: code = NotFound desc = could not find container \"5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986\": container with ID starting with 5ef15044ad6b1b324a5de6b267db8f09d0da69376f56fd89a0e330c6d81d5986 not found: ID does not exist" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.777207 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.786850 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nnl9x" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.895662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-combined-ca-bundle\") pod \"7ac58976-c8d4-406d-abf2-055b212106d1\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.895741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-config-data\") pod \"7ac58976-c8d4-406d-abf2-055b212106d1\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.895833 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2w4z\" (UniqueName: \"kubernetes.io/projected/7ac58976-c8d4-406d-abf2-055b212106d1-kube-api-access-h2w4z\") pod \"7ac58976-c8d4-406d-abf2-055b212106d1\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.895935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-combined-ca-bundle\") pod \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.896000 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-config-data\") pod \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.896024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zrsf8\" (UniqueName: \"kubernetes.io/projected/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-kube-api-access-zrsf8\") pod \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.896053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-scripts\") pod \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\" (UID: \"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.896244 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-scripts\") pod \"7ac58976-c8d4-406d-abf2-055b212106d1\" (UID: \"7ac58976-c8d4-406d-abf2-055b212106d1\") " Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.904149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac58976-c8d4-406d-abf2-055b212106d1-kube-api-access-h2w4z" (OuterVolumeSpecName: "kube-api-access-h2w4z") pod "7ac58976-c8d4-406d-abf2-055b212106d1" (UID: "7ac58976-c8d4-406d-abf2-055b212106d1"). InnerVolumeSpecName "kube-api-access-h2w4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.904206 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-kube-api-access-zrsf8" (OuterVolumeSpecName: "kube-api-access-zrsf8") pod "693f7b53-7a5c-42a3-b5e7-9fc96d46b78a" (UID: "693f7b53-7a5c-42a3-b5e7-9fc96d46b78a"). InnerVolumeSpecName "kube-api-access-zrsf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.906234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-scripts" (OuterVolumeSpecName: "scripts") pod "693f7b53-7a5c-42a3-b5e7-9fc96d46b78a" (UID: "693f7b53-7a5c-42a3-b5e7-9fc96d46b78a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.917045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-scripts" (OuterVolumeSpecName: "scripts") pod "7ac58976-c8d4-406d-abf2-055b212106d1" (UID: "7ac58976-c8d4-406d-abf2-055b212106d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.932381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-config-data" (OuterVolumeSpecName: "config-data") pod "693f7b53-7a5c-42a3-b5e7-9fc96d46b78a" (UID: "693f7b53-7a5c-42a3-b5e7-9fc96d46b78a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.934038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "693f7b53-7a5c-42a3-b5e7-9fc96d46b78a" (UID: "693f7b53-7a5c-42a3-b5e7-9fc96d46b78a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.934617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ac58976-c8d4-406d-abf2-055b212106d1" (UID: "7ac58976-c8d4-406d-abf2-055b212106d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:57 crc kubenswrapper[4707]: I0218 06:08:57.956132 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-config-data" (OuterVolumeSpecName: "config-data") pod "7ac58976-c8d4-406d-abf2-055b212106d1" (UID: "7ac58976-c8d4-406d-abf2-055b212106d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.014437 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.014502 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.014520 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2w4z\" (UniqueName: \"kubernetes.io/projected/7ac58976-c8d4-406d-abf2-055b212106d1-kube-api-access-h2w4z\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.014541 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.014559 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.014573 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrsf8\" (UniqueName: \"kubernetes.io/projected/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-kube-api-access-zrsf8\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.014586 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.014601 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ac58976-c8d4-406d-abf2-055b212106d1-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.067626 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3621d2e8-f3b4-41d6-9386-e151527d23d3" path="/var/lib/kubelet/pods/3621d2e8-f3b4-41d6-9386-e151527d23d3/volumes" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.294512 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gbm4m" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.294503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gbm4m" event={"ID":"7ac58976-c8d4-406d-abf2-055b212106d1","Type":"ContainerDied","Data":"3cd249988bac9eed5a103821bddd0b7fa35a9621c1af9503bcf307e4d766d083"} Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.294785 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cd249988bac9eed5a103821bddd0b7fa35a9621c1af9503bcf307e4d766d083" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.297060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nnl9x" event={"ID":"693f7b53-7a5c-42a3-b5e7-9fc96d46b78a","Type":"ContainerDied","Data":"01067c2b9a52a0567e711fee8b630935639c0f443cd70fb7303b8f184cc5ec64"} Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.297091 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01067c2b9a52a0567e711fee8b630935639c0f443cd70fb7303b8f184cc5ec64" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.297278 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nnl9x" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.399968 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 18 06:08:58 crc kubenswrapper[4707]: E0218 06:08:58.400781 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac58976-c8d4-406d-abf2-055b212106d1" containerName="nova-manage" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.400841 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac58976-c8d4-406d-abf2-055b212106d1" containerName="nova-manage" Feb 18 06:08:58 crc kubenswrapper[4707]: E0218 06:08:58.400882 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3621d2e8-f3b4-41d6-9386-e151527d23d3" containerName="dnsmasq-dns" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.400895 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3621d2e8-f3b4-41d6-9386-e151527d23d3" containerName="dnsmasq-dns" Feb 18 06:08:58 crc kubenswrapper[4707]: E0218 06:08:58.400955 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693f7b53-7a5c-42a3-b5e7-9fc96d46b78a" containerName="nova-cell1-conductor-db-sync" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.400967 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="693f7b53-7a5c-42a3-b5e7-9fc96d46b78a" containerName="nova-cell1-conductor-db-sync" Feb 18 06:08:58 crc kubenswrapper[4707]: E0218 06:08:58.400988 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3621d2e8-f3b4-41d6-9386-e151527d23d3" containerName="init" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.400999 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3621d2e8-f3b4-41d6-9386-e151527d23d3" containerName="init" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.401343 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3621d2e8-f3b4-41d6-9386-e151527d23d3" 
containerName="dnsmasq-dns" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.401384 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="693f7b53-7a5c-42a3-b5e7-9fc96d46b78a" containerName="nova-cell1-conductor-db-sync" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.401408 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac58976-c8d4-406d-abf2-055b212106d1" containerName="nova-manage" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.402909 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.405764 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.414465 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.496304 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.497193 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerName="nova-api-log" containerID="cri-o://e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51" gracePeriod=30 Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.497340 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerName="nova-api-api" containerID="cri-o://dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4" gracePeriod=30 Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.509967 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 
06:08:58.510257 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f7ec90c2-23aa-44b3-9086-fcc3f6c9a809" containerName="nova-scheduler-scheduler" containerID="cri-o://4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505" gracePeriod=30 Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.526022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btgfl\" (UniqueName: \"kubernetes.io/projected/95d9aeec-f182-49b1-9064-352e3bd2fe9b-kube-api-access-btgfl\") pod \"nova-cell1-conductor-0\" (UID: \"95d9aeec-f182-49b1-9064-352e3bd2fe9b\") " pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.526069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9aeec-f182-49b1-9064-352e3bd2fe9b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95d9aeec-f182-49b1-9064-352e3bd2fe9b\") " pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.526134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9aeec-f182-49b1-9064-352e3bd2fe9b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95d9aeec-f182-49b1-9064-352e3bd2fe9b\") " pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.593512 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.594005 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerName="nova-metadata-log" containerID="cri-o://ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101" 
gracePeriod=30 Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.594237 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerName="nova-metadata-metadata" containerID="cri-o://2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6" gracePeriod=30 Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.627493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btgfl\" (UniqueName: \"kubernetes.io/projected/95d9aeec-f182-49b1-9064-352e3bd2fe9b-kube-api-access-btgfl\") pod \"nova-cell1-conductor-0\" (UID: \"95d9aeec-f182-49b1-9064-352e3bd2fe9b\") " pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.627543 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9aeec-f182-49b1-9064-352e3bd2fe9b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95d9aeec-f182-49b1-9064-352e3bd2fe9b\") " pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.627611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9aeec-f182-49b1-9064-352e3bd2fe9b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95d9aeec-f182-49b1-9064-352e3bd2fe9b\") " pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.637734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9aeec-f182-49b1-9064-352e3bd2fe9b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95d9aeec-f182-49b1-9064-352e3bd2fe9b\") " pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.637779 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9aeec-f182-49b1-9064-352e3bd2fe9b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95d9aeec-f182-49b1-9064-352e3bd2fe9b\") " pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.645032 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.645111 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.652052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btgfl\" (UniqueName: \"kubernetes.io/projected/95d9aeec-f182-49b1-9064-352e3bd2fe9b-kube-api-access-btgfl\") pod \"nova-cell1-conductor-0\" (UID: \"95d9aeec-f182-49b1-9064-352e3bd2fe9b\") " pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:58 crc kubenswrapper[4707]: I0218 06:08:58.719379 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.127921 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.238955 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-config-data\") pod \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.239177 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfkpb\" (UniqueName: \"kubernetes.io/projected/0fddf93f-f494-44f8-aed3-231f9e1b84a8-kube-api-access-hfkpb\") pod \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.239923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-nova-metadata-tls-certs\") pod \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.239999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-combined-ca-bundle\") pod \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.240131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fddf93f-f494-44f8-aed3-231f9e1b84a8-logs\") pod \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\" (UID: \"0fddf93f-f494-44f8-aed3-231f9e1b84a8\") " Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.241060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0fddf93f-f494-44f8-aed3-231f9e1b84a8-logs" (OuterVolumeSpecName: "logs") pod "0fddf93f-f494-44f8-aed3-231f9e1b84a8" (UID: "0fddf93f-f494-44f8-aed3-231f9e1b84a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.245351 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fddf93f-f494-44f8-aed3-231f9e1b84a8-kube-api-access-hfkpb" (OuterVolumeSpecName: "kube-api-access-hfkpb") pod "0fddf93f-f494-44f8-aed3-231f9e1b84a8" (UID: "0fddf93f-f494-44f8-aed3-231f9e1b84a8"). InnerVolumeSpecName "kube-api-access-hfkpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.280505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-config-data" (OuterVolumeSpecName: "config-data") pod "0fddf93f-f494-44f8-aed3-231f9e1b84a8" (UID: "0fddf93f-f494-44f8-aed3-231f9e1b84a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.282924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fddf93f-f494-44f8-aed3-231f9e1b84a8" (UID: "0fddf93f-f494-44f8-aed3-231f9e1b84a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.308751 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 18 06:08:59 crc kubenswrapper[4707]: W0218 06:08:59.309987 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d9aeec_f182_49b1_9064_352e3bd2fe9b.slice/crio-cf62f0f29d5caadb61d5c3902a6810230dcf9cbb2326ae8b819dc1355b22a47c WatchSource:0}: Error finding container cf62f0f29d5caadb61d5c3902a6810230dcf9cbb2326ae8b819dc1355b22a47c: Status 404 returned error can't find the container with id cf62f0f29d5caadb61d5c3902a6810230dcf9cbb2326ae8b819dc1355b22a47c Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.310049 4707 generic.go:334] "Generic (PLEG): container finished" podID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerID="2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6" exitCode=0 Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.310156 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.310115 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0fddf93f-f494-44f8-aed3-231f9e1b84a8" (UID: "0fddf93f-f494-44f8-aed3-231f9e1b84a8"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.310151 4707 generic.go:334] "Generic (PLEG): container finished" podID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerID="ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101" exitCode=143 Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.310082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fddf93f-f494-44f8-aed3-231f9e1b84a8","Type":"ContainerDied","Data":"2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6"} Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.310605 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fddf93f-f494-44f8-aed3-231f9e1b84a8","Type":"ContainerDied","Data":"ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101"} Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.310624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fddf93f-f494-44f8-aed3-231f9e1b84a8","Type":"ContainerDied","Data":"13614d955c155e7520ffb715532626484e9914abf6892cb8586c6194429c9669"} Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.310642 4707 scope.go:117] "RemoveContainer" containerID="2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.316146 4707 generic.go:334] "Generic (PLEG): container finished" podID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerID="e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51" exitCode=143 Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.316183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0b84cf9d-dc6d-4210-acd8-f383978d61d5","Type":"ContainerDied","Data":"e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51"} Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 
06:08:59.340156 4707 scope.go:117] "RemoveContainer" containerID="ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.342835 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fddf93f-f494-44f8-aed3-231f9e1b84a8-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.342936 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.343006 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfkpb\" (UniqueName: \"kubernetes.io/projected/0fddf93f-f494-44f8-aed3-231f9e1b84a8-kube-api-access-hfkpb\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.343085 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.343169 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fddf93f-f494-44f8-aed3-231f9e1b84a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.385366 4707 scope.go:117] "RemoveContainer" containerID="2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6" Feb 18 06:08:59 crc kubenswrapper[4707]: E0218 06:08:59.385946 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6\": container with ID starting with 
2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6 not found: ID does not exist" containerID="2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.385977 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6"} err="failed to get container status \"2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6\": rpc error: code = NotFound desc = could not find container \"2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6\": container with ID starting with 2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6 not found: ID does not exist" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.385998 4707 scope.go:117] "RemoveContainer" containerID="ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101" Feb 18 06:08:59 crc kubenswrapper[4707]: E0218 06:08:59.386338 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101\": container with ID starting with ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101 not found: ID does not exist" containerID="ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.386388 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101"} err="failed to get container status \"ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101\": rpc error: code = NotFound desc = could not find container \"ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101\": container with ID starting with ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101 not found: ID does not 
exist" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.386418 4707 scope.go:117] "RemoveContainer" containerID="2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.386706 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6"} err="failed to get container status \"2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6\": rpc error: code = NotFound desc = could not find container \"2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6\": container with ID starting with 2ce9498f512242272c80fb49f1cb453bb75db44701f237affc6df03bb552dab6 not found: ID does not exist" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.386743 4707 scope.go:117] "RemoveContainer" containerID="ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.387182 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101"} err="failed to get container status \"ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101\": rpc error: code = NotFound desc = could not find container \"ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101\": container with ID starting with ce9cc1f6bf58dfc919287da1bc165e51bb88a9894845c7b7cfa58bcf7cdce101 not found: ID does not exist" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.775013 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.792938 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.808310 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Feb 18 06:08:59 crc kubenswrapper[4707]: E0218 06:08:59.808776 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerName="nova-metadata-log" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.808851 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerName="nova-metadata-log" Feb 18 06:08:59 crc kubenswrapper[4707]: E0218 06:08:59.808869 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerName="nova-metadata-metadata" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.808875 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerName="nova-metadata-metadata" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.809075 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerName="nova-metadata-metadata" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.809106 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" containerName="nova-metadata-log" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.810132 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.812138 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.816178 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.822515 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.855902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.855949 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc9sz\" (UniqueName: \"kubernetes.io/projected/55fa9858-323a-452c-a0aa-7a1207e40ca2-kube-api-access-sc9sz\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.856037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fa9858-323a-452c-a0aa-7a1207e40ca2-logs\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.856143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-config-data\") pod \"nova-metadata-0\" 
(UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.856171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.958125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-config-data\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.958174 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.958215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.958253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9sz\" (UniqueName: \"kubernetes.io/projected/55fa9858-323a-452c-a0aa-7a1207e40ca2-kube-api-access-sc9sz\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 
06:08:59.958320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fa9858-323a-452c-a0aa-7a1207e40ca2-logs\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.959655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fa9858-323a-452c-a0aa-7a1207e40ca2-logs\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.965435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.965466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-config-data\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.973024 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:08:59 crc kubenswrapper[4707]: I0218 06:08:59.978125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc9sz\" (UniqueName: \"kubernetes.io/projected/55fa9858-323a-452c-a0aa-7a1207e40ca2-kube-api-access-sc9sz\") pod \"nova-metadata-0\" (UID: 
\"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " pod="openstack/nova-metadata-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.058067 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.074385 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fddf93f-f494-44f8-aed3-231f9e1b84a8" path="/var/lib/kubelet/pods/0fddf93f-f494-44f8-aed3-231f9e1b84a8/volumes" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.132210 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.162254 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpvkk\" (UniqueName: \"kubernetes.io/projected/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-kube-api-access-hpvkk\") pod \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.162365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-combined-ca-bundle\") pod \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.162515 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-config-data\") pod \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\" (UID: \"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809\") " Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.172647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-kube-api-access-hpvkk" (OuterVolumeSpecName: 
"kube-api-access-hpvkk") pod "f7ec90c2-23aa-44b3-9086-fcc3f6c9a809" (UID: "f7ec90c2-23aa-44b3-9086-fcc3f6c9a809"). InnerVolumeSpecName "kube-api-access-hpvkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.192008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-config-data" (OuterVolumeSpecName: "config-data") pod "f7ec90c2-23aa-44b3-9086-fcc3f6c9a809" (UID: "f7ec90c2-23aa-44b3-9086-fcc3f6c9a809"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.194468 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7ec90c2-23aa-44b3-9086-fcc3f6c9a809" (UID: "f7ec90c2-23aa-44b3-9086-fcc3f6c9a809"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.265473 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.265525 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpvkk\" (UniqueName: \"kubernetes.io/projected/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-kube-api-access-hpvkk\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.265538 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.331962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9aeec-f182-49b1-9064-352e3bd2fe9b","Type":"ContainerStarted","Data":"ae20148f8adbf16ad8a04160dc14f6f86013a7a73e9e54cbb6430e8b042ed37b"} Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.332008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9aeec-f182-49b1-9064-352e3bd2fe9b","Type":"ContainerStarted","Data":"cf62f0f29d5caadb61d5c3902a6810230dcf9cbb2326ae8b819dc1355b22a47c"} Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.333228 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.337886 4707 generic.go:334] "Generic (PLEG): container finished" podID="f7ec90c2-23aa-44b3-9086-fcc3f6c9a809" containerID="4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505" exitCode=0 Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.337916 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809","Type":"ContainerDied","Data":"4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505"} Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.337932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7ec90c2-23aa-44b3-9086-fcc3f6c9a809","Type":"ContainerDied","Data":"be6aa0003a8cf64ed0fb24cb6af7bc1a701eb8ab80bb428c5c5328534ac6993e"} Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.337948 4707 scope.go:117] "RemoveContainer" containerID="4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.338008 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.355204 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.355185306 podStartE2EDuration="2.355185306s" podCreationTimestamp="2026-02-18 06:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:00.3456382 +0000 UTC m=+1276.993597334" watchObservedRunningTime="2026-02-18 06:09:00.355185306 +0000 UTC m=+1277.003144440" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.388331 4707 scope.go:117] "RemoveContainer" containerID="4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505" Feb 18 06:09:00 crc kubenswrapper[4707]: E0218 06:09:00.391660 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505\": container with ID starting with 4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505 not found: ID does not exist" 
containerID="4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.391712 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505"} err="failed to get container status \"4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505\": rpc error: code = NotFound desc = could not find container \"4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505\": container with ID starting with 4f81c438a46b7b119752eb2cb5deb1d040aba5aaece4dc8fb17665467f02b505 not found: ID does not exist" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.392479 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.405938 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.415157 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:00 crc kubenswrapper[4707]: E0218 06:09:00.415741 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ec90c2-23aa-44b3-9086-fcc3f6c9a809" containerName="nova-scheduler-scheduler" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.416145 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ec90c2-23aa-44b3-9086-fcc3f6c9a809" containerName="nova-scheduler-scheduler" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.416541 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ec90c2-23aa-44b3-9086-fcc3f6c9a809" containerName="nova-scheduler-scheduler" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.417519 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.422738 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.423552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.472107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vhmd\" (UniqueName: \"kubernetes.io/projected/a64cc687-253e-4b0b-8394-a85a561826cc-kube-api-access-7vhmd\") pod \"nova-scheduler-0\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.472290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.472416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-config-data\") pod \"nova-scheduler-0\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.574126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-config-data\") pod \"nova-scheduler-0\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.574270 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7vhmd\" (UniqueName: \"kubernetes.io/projected/a64cc687-253e-4b0b-8394-a85a561826cc-kube-api-access-7vhmd\") pod \"nova-scheduler-0\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.574447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.580507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-config-data\") pod \"nova-scheduler-0\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.580691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.592491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vhmd\" (UniqueName: \"kubernetes.io/projected/a64cc687-253e-4b0b-8394-a85a561826cc-kube-api-access-7vhmd\") pod \"nova-scheduler-0\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.604725 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:09:00 crc kubenswrapper[4707]: W0218 06:09:00.609664 4707 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55fa9858_323a_452c_a0aa_7a1207e40ca2.slice/crio-24f597ba259204c97b1c040feb9ac8eda9271a28ddf3d45d67cbe61829e543e2 WatchSource:0}: Error finding container 24f597ba259204c97b1c040feb9ac8eda9271a28ddf3d45d67cbe61829e543e2: Status 404 returned error can't find the container with id 24f597ba259204c97b1c040feb9ac8eda9271a28ddf3d45d67cbe61829e543e2 Feb 18 06:09:00 crc kubenswrapper[4707]: I0218 06:09:00.740819 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:09:01 crc kubenswrapper[4707]: W0218 06:09:01.258968 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64cc687_253e_4b0b_8394_a85a561826cc.slice/crio-9b2ca0db34405feec2ef54155195010a2e77309491ecd761d8a638973b1f9537 WatchSource:0}: Error finding container 9b2ca0db34405feec2ef54155195010a2e77309491ecd761d8a638973b1f9537: Status 404 returned error can't find the container with id 9b2ca0db34405feec2ef54155195010a2e77309491ecd761d8a638973b1f9537 Feb 18 06:09:01 crc kubenswrapper[4707]: I0218 06:09:01.265902 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:01 crc kubenswrapper[4707]: I0218 06:09:01.355233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a64cc687-253e-4b0b-8394-a85a561826cc","Type":"ContainerStarted","Data":"9b2ca0db34405feec2ef54155195010a2e77309491ecd761d8a638973b1f9537"} Feb 18 06:09:01 crc kubenswrapper[4707]: I0218 06:09:01.357173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55fa9858-323a-452c-a0aa-7a1207e40ca2","Type":"ContainerStarted","Data":"680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814"} Feb 18 06:09:01 crc kubenswrapper[4707]: I0218 06:09:01.357244 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55fa9858-323a-452c-a0aa-7a1207e40ca2","Type":"ContainerStarted","Data":"4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31"} Feb 18 06:09:01 crc kubenswrapper[4707]: I0218 06:09:01.357260 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55fa9858-323a-452c-a0aa-7a1207e40ca2","Type":"ContainerStarted","Data":"24f597ba259204c97b1c040feb9ac8eda9271a28ddf3d45d67cbe61829e543e2"} Feb 18 06:09:01 crc kubenswrapper[4707]: I0218 06:09:01.378225 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.378209901 podStartE2EDuration="2.378209901s" podCreationTimestamp="2026-02-18 06:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:01.374009569 +0000 UTC m=+1278.021968703" watchObservedRunningTime="2026-02-18 06:09:01.378209901 +0000 UTC m=+1278.026169035" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.067618 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ec90c2-23aa-44b3-9086-fcc3f6c9a809" path="/var/lib/kubelet/pods/f7ec90c2-23aa-44b3-9086-fcc3f6c9a809/volumes" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.316442 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.419234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2xrn\" (UniqueName: \"kubernetes.io/projected/0b84cf9d-dc6d-4210-acd8-f383978d61d5-kube-api-access-w2xrn\") pod \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.419383 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-combined-ca-bundle\") pod \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.419492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b84cf9d-dc6d-4210-acd8-f383978d61d5-logs\") pod \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.419618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-config-data\") pod \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\" (UID: \"0b84cf9d-dc6d-4210-acd8-f383978d61d5\") " Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.421425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b84cf9d-dc6d-4210-acd8-f383978d61d5-logs" (OuterVolumeSpecName: "logs") pod "0b84cf9d-dc6d-4210-acd8-f383978d61d5" (UID: "0b84cf9d-dc6d-4210-acd8-f383978d61d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.428237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b84cf9d-dc6d-4210-acd8-f383978d61d5-kube-api-access-w2xrn" (OuterVolumeSpecName: "kube-api-access-w2xrn") pod "0b84cf9d-dc6d-4210-acd8-f383978d61d5" (UID: "0b84cf9d-dc6d-4210-acd8-f383978d61d5"). InnerVolumeSpecName "kube-api-access-w2xrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.433221 4707 generic.go:334] "Generic (PLEG): container finished" podID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerID="dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4" exitCode=0 Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.433328 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0b84cf9d-dc6d-4210-acd8-f383978d61d5","Type":"ContainerDied","Data":"dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4"} Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.433401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0b84cf9d-dc6d-4210-acd8-f383978d61d5","Type":"ContainerDied","Data":"27f9d59c2ec2fe742ca7b91f7e55e59ba8d246aaed31e6e5e51b6b4ef8cf66ed"} Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.433323 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.433428 4707 scope.go:117] "RemoveContainer" containerID="dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.445914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a64cc687-253e-4b0b-8394-a85a561826cc","Type":"ContainerStarted","Data":"f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79"} Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.466709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-config-data" (OuterVolumeSpecName: "config-data") pod "0b84cf9d-dc6d-4210-acd8-f383978d61d5" (UID: "0b84cf9d-dc6d-4210-acd8-f383978d61d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.469222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b84cf9d-dc6d-4210-acd8-f383978d61d5" (UID: "0b84cf9d-dc6d-4210-acd8-f383978d61d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.473767 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.473749053 podStartE2EDuration="2.473749053s" podCreationTimestamp="2026-02-18 06:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:02.463454707 +0000 UTC m=+1279.111413841" watchObservedRunningTime="2026-02-18 06:09:02.473749053 +0000 UTC m=+1279.121708187" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.490553 4707 scope.go:117] "RemoveContainer" containerID="e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.516776 4707 scope.go:117] "RemoveContainer" containerID="dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4" Feb 18 06:09:02 crc kubenswrapper[4707]: E0218 06:09:02.517326 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4\": container with ID starting with dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4 not found: ID does not exist" containerID="dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.517373 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4"} err="failed to get container status \"dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4\": rpc error: code = NotFound desc = could not find container \"dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4\": container with ID starting with dea3507d7215c4c42c0aa87e4cbf4849512e83d6ecb99041b274556818187da4 
not found: ID does not exist" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.517401 4707 scope.go:117] "RemoveContainer" containerID="e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51" Feb 18 06:09:02 crc kubenswrapper[4707]: E0218 06:09:02.517838 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51\": container with ID starting with e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51 not found: ID does not exist" containerID="e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.517880 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51"} err="failed to get container status \"e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51\": rpc error: code = NotFound desc = could not find container \"e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51\": container with ID starting with e26319e91fc55e08fc3f1c44b3eb32d793708f10ab5ae2535a6cb69b8aa69c51 not found: ID does not exist" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.522121 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.522143 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b84cf9d-dc6d-4210-acd8-f383978d61d5-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.522155 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0b84cf9d-dc6d-4210-acd8-f383978d61d5-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.522165 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2xrn\" (UniqueName: \"kubernetes.io/projected/0b84cf9d-dc6d-4210-acd8-f383978d61d5-kube-api-access-w2xrn\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.770178 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.780759 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.795525 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:02 crc kubenswrapper[4707]: E0218 06:09:02.796246 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerName="nova-api-log" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.796264 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerName="nova-api-log" Feb 18 06:09:02 crc kubenswrapper[4707]: E0218 06:09:02.796299 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerName="nova-api-api" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.796309 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerName="nova-api-api" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.796522 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" containerName="nova-api-log" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.796547 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" 
containerName="nova-api-api" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.797538 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.799813 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.818225 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.929737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-config-data\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.929835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33019be9-adbc-447e-9072-71ad0fa08a14-logs\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.930507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cw26\" (UniqueName: \"kubernetes.io/projected/33019be9-adbc-447e-9072-71ad0fa08a14-kube-api-access-5cw26\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:02 crc kubenswrapper[4707]: I0218 06:09:02.931006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:03 
crc kubenswrapper[4707]: I0218 06:09:03.032556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-config-data\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4707]: I0218 06:09:03.032649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33019be9-adbc-447e-9072-71ad0fa08a14-logs\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4707]: I0218 06:09:03.032707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cw26\" (UniqueName: \"kubernetes.io/projected/33019be9-adbc-447e-9072-71ad0fa08a14-kube-api-access-5cw26\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4707]: I0218 06:09:03.032772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4707]: I0218 06:09:03.033709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33019be9-adbc-447e-9072-71ad0fa08a14-logs\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4707]: I0218 06:09:03.037279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4707]: I0218 06:09:03.037876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-config-data\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4707]: I0218 06:09:03.053251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cw26\" (UniqueName: \"kubernetes.io/projected/33019be9-adbc-447e-9072-71ad0fa08a14-kube-api-access-5cw26\") pod \"nova-api-0\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4707]: I0218 06:09:03.130776 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:03 crc kubenswrapper[4707]: I0218 06:09:03.604953 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:03 crc kubenswrapper[4707]: W0218 06:09:03.606664 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33019be9_adbc_447e_9072_71ad0fa08a14.slice/crio-757fe80d937ddda88563e8ee502d56e313b9ef1b401b2a57be145fbd7e783348 WatchSource:0}: Error finding container 757fe80d937ddda88563e8ee502d56e313b9ef1b401b2a57be145fbd7e783348: Status 404 returned error can't find the container with id 757fe80d937ddda88563e8ee502d56e313b9ef1b401b2a57be145fbd7e783348 Feb 18 06:09:04 crc kubenswrapper[4707]: I0218 06:09:04.066930 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b84cf9d-dc6d-4210-acd8-f383978d61d5" path="/var/lib/kubelet/pods/0b84cf9d-dc6d-4210-acd8-f383978d61d5/volumes" Feb 18 06:09:04 crc kubenswrapper[4707]: I0218 06:09:04.467037 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"33019be9-adbc-447e-9072-71ad0fa08a14","Type":"ContainerStarted","Data":"01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce"} Feb 18 06:09:04 crc kubenswrapper[4707]: I0218 06:09:04.467438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33019be9-adbc-447e-9072-71ad0fa08a14","Type":"ContainerStarted","Data":"57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355"} Feb 18 06:09:04 crc kubenswrapper[4707]: I0218 06:09:04.467449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33019be9-adbc-447e-9072-71ad0fa08a14","Type":"ContainerStarted","Data":"757fe80d937ddda88563e8ee502d56e313b9ef1b401b2a57be145fbd7e783348"} Feb 18 06:09:04 crc kubenswrapper[4707]: I0218 06:09:04.488092 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.488074761 podStartE2EDuration="2.488074761s" podCreationTimestamp="2026-02-18 06:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:04.480507588 +0000 UTC m=+1281.128466722" watchObservedRunningTime="2026-02-18 06:09:04.488074761 +0000 UTC m=+1281.136033895" Feb 18 06:09:05 crc kubenswrapper[4707]: I0218 06:09:05.133032 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:09:05 crc kubenswrapper[4707]: I0218 06:09:05.133313 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:09:05 crc kubenswrapper[4707]: I0218 06:09:05.741367 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 18 06:09:08 crc kubenswrapper[4707]: I0218 06:09:08.747450 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 
18 06:09:10 crc kubenswrapper[4707]: I0218 06:09:10.132337 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 06:09:10 crc kubenswrapper[4707]: I0218 06:09:10.132459 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 06:09:10 crc kubenswrapper[4707]: I0218 06:09:10.741962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 18 06:09:10 crc kubenswrapper[4707]: I0218 06:09:10.769513 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 18 06:09:11 crc kubenswrapper[4707]: I0218 06:09:11.146940 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:09:11 crc kubenswrapper[4707]: I0218 06:09:11.146940 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:09:11 crc kubenswrapper[4707]: I0218 06:09:11.567191 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 06:09:13 crc kubenswrapper[4707]: I0218 06:09:13.131187 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:09:13 crc kubenswrapper[4707]: I0218 06:09:13.131558 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:09:14 crc kubenswrapper[4707]: I0218 06:09:14.216540 
4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:09:14 crc kubenswrapper[4707]: I0218 06:09:14.216578 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 06:09:14 crc kubenswrapper[4707]: I0218 06:09:14.660198 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 18 06:09:20 crc kubenswrapper[4707]: I0218 06:09:20.138179 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 06:09:20 crc kubenswrapper[4707]: I0218 06:09:20.139805 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 06:09:20 crc kubenswrapper[4707]: I0218 06:09:20.146485 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 06:09:20 crc kubenswrapper[4707]: I0218 06:09:20.682406 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 06:09:21 crc kubenswrapper[4707]: I0218 06:09:21.381966 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:09:21 crc kubenswrapper[4707]: I0218 06:09:21.382346 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:09:21 crc kubenswrapper[4707]: I0218 06:09:21.382396 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:09:21 crc kubenswrapper[4707]: I0218 06:09:21.383381 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3eb8d09ea3950a1c29c70e73d11ea5133c61c40a9512fdef46057924b3898430"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:09:21 crc kubenswrapper[4707]: I0218 06:09:21.383443 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://3eb8d09ea3950a1c29c70e73d11ea5133c61c40a9512fdef46057924b3898430" gracePeriod=600 Feb 18 06:09:21 crc kubenswrapper[4707]: I0218 06:09:21.688236 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="3eb8d09ea3950a1c29c70e73d11ea5133c61c40a9512fdef46057924b3898430" exitCode=0 Feb 18 06:09:21 crc kubenswrapper[4707]: I0218 06:09:21.688325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"3eb8d09ea3950a1c29c70e73d11ea5133c61c40a9512fdef46057924b3898430"} Feb 18 06:09:21 crc kubenswrapper[4707]: I0218 06:09:21.688863 4707 scope.go:117] "RemoveContainer" 
containerID="27b00527a1a2dd19572cd4b34eeb62edfbb26a8ff621d8e0ad7b7a217cf69cd3" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.683207 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.698020 4707 generic.go:334] "Generic (PLEG): container finished" podID="8acba92d-11e1-467a-aa8e-9cc9078c5dc2" containerID="e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a" exitCode=137 Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.698088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8acba92d-11e1-467a-aa8e-9cc9078c5dc2","Type":"ContainerDied","Data":"e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a"} Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.698120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8acba92d-11e1-467a-aa8e-9cc9078c5dc2","Type":"ContainerDied","Data":"279467e5694bc782d595e526c4b3fe39ef47712192ca6f60f6107e5d4f262267"} Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.698138 4707 scope.go:117] "RemoveContainer" containerID="e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.698156 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.701209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af"} Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.767790 4707 scope.go:117] "RemoveContainer" containerID="e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a" Feb 18 06:09:22 crc kubenswrapper[4707]: E0218 06:09:22.768529 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a\": container with ID starting with e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a not found: ID does not exist" containerID="e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.768570 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a"} err="failed to get container status \"e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a\": rpc error: code = NotFound desc = could not find container \"e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a\": container with ID starting with e82cc16a22bb2a991887e468ad0cf8beca683be5fd09964707534293a9d6d82a not found: ID does not exist" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.775349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxhnz\" (UniqueName: \"kubernetes.io/projected/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-kube-api-access-cxhnz\") pod \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\" (UID: 
\"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.775604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-config-data\") pod \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.775708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-combined-ca-bundle\") pod \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\" (UID: \"8acba92d-11e1-467a-aa8e-9cc9078c5dc2\") " Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.792092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-kube-api-access-cxhnz" (OuterVolumeSpecName: "kube-api-access-cxhnz") pod "8acba92d-11e1-467a-aa8e-9cc9078c5dc2" (UID: "8acba92d-11e1-467a-aa8e-9cc9078c5dc2"). InnerVolumeSpecName "kube-api-access-cxhnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.804980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-config-data" (OuterVolumeSpecName: "config-data") pod "8acba92d-11e1-467a-aa8e-9cc9078c5dc2" (UID: "8acba92d-11e1-467a-aa8e-9cc9078c5dc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.805626 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8acba92d-11e1-467a-aa8e-9cc9078c5dc2" (UID: "8acba92d-11e1-467a-aa8e-9cc9078c5dc2"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.878119 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxhnz\" (UniqueName: \"kubernetes.io/projected/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-kube-api-access-cxhnz\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.878308 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:22 crc kubenswrapper[4707]: I0218 06:09:22.878394 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acba92d-11e1-467a-aa8e-9cc9078c5dc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.032237 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.040640 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.064652 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:09:23 crc kubenswrapper[4707]: E0218 06:09:23.065785 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8acba92d-11e1-467a-aa8e-9cc9078c5dc2" containerName="nova-cell1-novncproxy-novncproxy" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.065808 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8acba92d-11e1-467a-aa8e-9cc9078c5dc2" containerName="nova-cell1-novncproxy-novncproxy" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.066097 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8acba92d-11e1-467a-aa8e-9cc9078c5dc2" 
containerName="nova-cell1-novncproxy-novncproxy" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.066978 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.069376 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.069688 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.071402 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.077541 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.135860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.136438 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.139550 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.139605 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.184395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvbd\" (UniqueName: \"kubernetes.io/projected/891ed851-3533-43e4-a60b-791e4ebd0afa-kube-api-access-5dvbd\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 
06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.184440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.184476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.184515 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.184549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.285739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 
06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.285852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.285998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvbd\" (UniqueName: \"kubernetes.io/projected/891ed851-3533-43e4-a60b-791e4ebd0afa-kube-api-access-5dvbd\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.286019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.286051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.292662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.292697 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.299996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.300167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891ed851-3533-43e4-a60b-791e4ebd0afa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.302513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvbd\" (UniqueName: \"kubernetes.io/projected/891ed851-3533-43e4-a60b-791e4ebd0afa-kube-api-access-5dvbd\") pod \"nova-cell1-novncproxy-0\" (UID: \"891ed851-3533-43e4-a60b-791e4ebd0afa\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.390209 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.712407 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.716597 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 06:09:23 crc kubenswrapper[4707]: I0218 06:09:23.855157 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.049665 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9d8756bc-mqn6s"] Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.051524 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.069176 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8acba92d-11e1-467a-aa8e-9cc9078c5dc2" path="/var/lib/kubelet/pods/8acba92d-11e1-467a-aa8e-9cc9078c5dc2/volumes" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.171277 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9d8756bc-mqn6s"] Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.212128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.212181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-svc\") pod 
\"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.212213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.212283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdpbr\" (UniqueName: \"kubernetes.io/projected/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-kube-api-access-qdpbr\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.212322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.212400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-config\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.314594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-config\") 
pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.314675 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.314694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-svc\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.314720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.314774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdpbr\" (UniqueName: \"kubernetes.io/projected/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-kube-api-access-qdpbr\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.314823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: 
\"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.315987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.317461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.317502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.317983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-svc\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.318201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-config\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc 
kubenswrapper[4707]: I0218 06:09:24.331251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdpbr\" (UniqueName: \"kubernetes.io/projected/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-kube-api-access-qdpbr\") pod \"dnsmasq-dns-6b9d8756bc-mqn6s\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.543419 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.726134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"891ed851-3533-43e4-a60b-791e4ebd0afa","Type":"ContainerStarted","Data":"67f01b0df179f6fcb143f2be35bdfb4a7bdeaa2e20f7d13afcb5db35ab48904c"} Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.726440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"891ed851-3533-43e4-a60b-791e4ebd0afa","Type":"ContainerStarted","Data":"dabed2541a753d76bc59b21f88dbe9abbcc8c31ca2b9c25e4f3a05f202f1cd24"} Feb 18 06:09:24 crc kubenswrapper[4707]: I0218 06:09:24.764286 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.7642692150000001 podStartE2EDuration="1.764269215s" podCreationTimestamp="2026-02-18 06:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:24.750960268 +0000 UTC m=+1301.398919402" watchObservedRunningTime="2026-02-18 06:09:24.764269215 +0000 UTC m=+1301.412228349" Feb 18 06:09:25 crc kubenswrapper[4707]: I0218 06:09:25.080553 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9d8756bc-mqn6s"] Feb 18 06:09:25 crc kubenswrapper[4707]: I0218 06:09:25.734041 4707 generic.go:334] 
"Generic (PLEG): container finished" podID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" containerID="e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604" exitCode=0 Feb 18 06:09:25 crc kubenswrapper[4707]: I0218 06:09:25.735837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" event={"ID":"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b","Type":"ContainerDied","Data":"e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604"} Feb 18 06:09:25 crc kubenswrapper[4707]: I0218 06:09:25.735865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" event={"ID":"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b","Type":"ContainerStarted","Data":"79bd3d0b96a0314fa0ecc1d675ff57b9e19b8b53fb59fc22aa092f1e50252b5f"} Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.182572 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.184485 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="ceilometer-central-agent" containerID="cri-o://ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c" gracePeriod=30 Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.184561 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="proxy-httpd" containerID="cri-o://0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a" gracePeriod=30 Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.184627 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="sg-core" containerID="cri-o://75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e" gracePeriod=30 Feb 18 
06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.184679 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="ceilometer-notification-agent" containerID="cri-o://daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5" gracePeriod=30 Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.696550 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.747928 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0884a8f-f290-4556-9808-47963ef4cd51" containerID="0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a" exitCode=0 Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.747970 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0884a8f-f290-4556-9808-47963ef4cd51" containerID="75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e" exitCode=2 Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.747987 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0884a8f-f290-4556-9808-47963ef4cd51" containerID="ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c" exitCode=0 Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.748040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerDied","Data":"0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a"} Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.748074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerDied","Data":"75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e"} Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.748087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerDied","Data":"ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c"} Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.750671 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" containerName="nova-api-log" containerID="cri-o://57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355" gracePeriod=30 Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.751897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" event={"ID":"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b","Type":"ContainerStarted","Data":"6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107"} Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.751937 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.752317 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" containerName="nova-api-api" containerID="cri-o://01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce" gracePeriod=30 Feb 18 06:09:26 crc kubenswrapper[4707]: I0218 06:09:26.788698 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" podStartSLOduration=3.788677931 podStartE2EDuration="3.788677931s" podCreationTimestamp="2026-02-18 06:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:26.785470625 +0000 UTC m=+1303.433429759" watchObservedRunningTime="2026-02-18 06:09:26.788677931 +0000 UTC m=+1303.436637065" Feb 18 06:09:27 crc kubenswrapper[4707]: I0218 06:09:27.761501 4707 
generic.go:334] "Generic (PLEG): container finished" podID="33019be9-adbc-447e-9072-71ad0fa08a14" containerID="57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355" exitCode=143 Feb 18 06:09:27 crc kubenswrapper[4707]: I0218 06:09:27.761569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33019be9-adbc-447e-9072-71ad0fa08a14","Type":"ContainerDied","Data":"57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355"} Feb 18 06:09:28 crc kubenswrapper[4707]: I0218 06:09:28.391575 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.618913 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.637305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-config-data\") pod \"33019be9-adbc-447e-9072-71ad0fa08a14\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.637372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-combined-ca-bundle\") pod \"33019be9-adbc-447e-9072-71ad0fa08a14\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.637449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33019be9-adbc-447e-9072-71ad0fa08a14-logs\") pod \"33019be9-adbc-447e-9072-71ad0fa08a14\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.637495 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-5cw26\" (UniqueName: \"kubernetes.io/projected/33019be9-adbc-447e-9072-71ad0fa08a14-kube-api-access-5cw26\") pod \"33019be9-adbc-447e-9072-71ad0fa08a14\" (UID: \"33019be9-adbc-447e-9072-71ad0fa08a14\") " Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.637992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33019be9-adbc-447e-9072-71ad0fa08a14-logs" (OuterVolumeSpecName: "logs") pod "33019be9-adbc-447e-9072-71ad0fa08a14" (UID: "33019be9-adbc-447e-9072-71ad0fa08a14"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.638334 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33019be9-adbc-447e-9072-71ad0fa08a14-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.664422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33019be9-adbc-447e-9072-71ad0fa08a14-kube-api-access-5cw26" (OuterVolumeSpecName: "kube-api-access-5cw26") pod "33019be9-adbc-447e-9072-71ad0fa08a14" (UID: "33019be9-adbc-447e-9072-71ad0fa08a14"). InnerVolumeSpecName "kube-api-access-5cw26". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.690973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-config-data" (OuterVolumeSpecName: "config-data") pod "33019be9-adbc-447e-9072-71ad0fa08a14" (UID: "33019be9-adbc-447e-9072-71ad0fa08a14"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.723950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33019be9-adbc-447e-9072-71ad0fa08a14" (UID: "33019be9-adbc-447e-9072-71ad0fa08a14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.740531 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.740563 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33019be9-adbc-447e-9072-71ad0fa08a14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.740576 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cw26\" (UniqueName: \"kubernetes.io/projected/33019be9-adbc-447e-9072-71ad0fa08a14-kube-api-access-5cw26\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.801314 4707 generic.go:334] "Generic (PLEG): container finished" podID="33019be9-adbc-447e-9072-71ad0fa08a14" containerID="01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce" exitCode=0 Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.801386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33019be9-adbc-447e-9072-71ad0fa08a14","Type":"ContainerDied","Data":"01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce"} Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.801523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"33019be9-adbc-447e-9072-71ad0fa08a14","Type":"ContainerDied","Data":"757fe80d937ddda88563e8ee502d56e313b9ef1b401b2a57be145fbd7e783348"} Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.801557 4707 scope.go:117] "RemoveContainer" containerID="01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.801824 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.836728 4707 scope.go:117] "RemoveContainer" containerID="57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.864052 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.865400 4707 scope.go:117] "RemoveContainer" containerID="01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce" Feb 18 06:09:30 crc kubenswrapper[4707]: E0218 06:09:30.865903 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce\": container with ID starting with 01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce not found: ID does not exist" containerID="01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.865941 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce"} err="failed to get container status \"01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce\": rpc error: code = NotFound desc = could not find container \"01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce\": container with ID starting with 
01c7bef6a0223c7e07525473dec86d8cfdeda54afff6626027274625429c56ce not found: ID does not exist" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.865982 4707 scope.go:117] "RemoveContainer" containerID="57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355" Feb 18 06:09:30 crc kubenswrapper[4707]: E0218 06:09:30.866337 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355\": container with ID starting with 57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355 not found: ID does not exist" containerID="57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.866436 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355"} err="failed to get container status \"57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355\": rpc error: code = NotFound desc = could not find container \"57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355\": container with ID starting with 57fdedfd8ad947eaac8ca25472150127152f9b235a6bdc578354ee5e50ae6355 not found: ID does not exist" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.889576 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.909101 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:30 crc kubenswrapper[4707]: E0218 06:09:30.909942 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" containerName="nova-api-api" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.910025 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" 
containerName="nova-api-api" Feb 18 06:09:30 crc kubenswrapper[4707]: E0218 06:09:30.910084 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" containerName="nova-api-log" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.910142 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" containerName="nova-api-log" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.910428 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" containerName="nova-api-log" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.910500 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" containerName="nova-api-api" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.911760 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.914571 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.914844 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.915045 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.930929 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.964473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-public-tls-certs\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " 
pod="openstack/nova-api-0" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.964532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.964611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-logs\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.964662 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.964737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc58p\" (UniqueName: \"kubernetes.io/projected/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-kube-api-access-jc58p\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:30 crc kubenswrapper[4707]: I0218 06:09:30.964779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-config-data\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.066398 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.066487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-logs\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.066528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.066600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc58p\" (UniqueName: \"kubernetes.io/projected/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-kube-api-access-jc58p\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.066636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-config-data\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.066674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-public-tls-certs\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " 
pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.067019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-logs\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.072950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.075351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-public-tls-certs\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.075992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-config-data\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.076253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.084053 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc58p\" (UniqueName: \"kubernetes.io/projected/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-kube-api-access-jc58p\") 
pod \"nova-api-0\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.233263 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.482919 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.576837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-sg-core-conf-yaml\") pod \"f0884a8f-f290-4556-9808-47963ef4cd51\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.576981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-ceilometer-tls-certs\") pod \"f0884a8f-f290-4556-9808-47963ef4cd51\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.577015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-log-httpd\") pod \"f0884a8f-f290-4556-9808-47963ef4cd51\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.577063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-run-httpd\") pod \"f0884a8f-f290-4556-9808-47963ef4cd51\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.577120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-q2frp\" (UniqueName: \"kubernetes.io/projected/f0884a8f-f290-4556-9808-47963ef4cd51-kube-api-access-q2frp\") pod \"f0884a8f-f290-4556-9808-47963ef4cd51\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.577177 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-config-data\") pod \"f0884a8f-f290-4556-9808-47963ef4cd51\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.577200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-combined-ca-bundle\") pod \"f0884a8f-f290-4556-9808-47963ef4cd51\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.577241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-scripts\") pod \"f0884a8f-f290-4556-9808-47963ef4cd51\" (UID: \"f0884a8f-f290-4556-9808-47963ef4cd51\") " Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.578400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f0884a8f-f290-4556-9808-47963ef4cd51" (UID: "f0884a8f-f290-4556-9808-47963ef4cd51"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.580741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f0884a8f-f290-4556-9808-47963ef4cd51" (UID: "f0884a8f-f290-4556-9808-47963ef4cd51"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.584054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-scripts" (OuterVolumeSpecName: "scripts") pod "f0884a8f-f290-4556-9808-47963ef4cd51" (UID: "f0884a8f-f290-4556-9808-47963ef4cd51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.584222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0884a8f-f290-4556-9808-47963ef4cd51-kube-api-access-q2frp" (OuterVolumeSpecName: "kube-api-access-q2frp") pod "f0884a8f-f290-4556-9808-47963ef4cd51" (UID: "f0884a8f-f290-4556-9808-47963ef4cd51"). InnerVolumeSpecName "kube-api-access-q2frp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.610594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f0884a8f-f290-4556-9808-47963ef4cd51" (UID: "f0884a8f-f290-4556-9808-47963ef4cd51"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.642768 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f0884a8f-f290-4556-9808-47963ef4cd51" (UID: "f0884a8f-f290-4556-9808-47963ef4cd51"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.680755 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.681248 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.681552 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.682178 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0884a8f-f290-4556-9808-47963ef4cd51-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.682339 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2frp\" (UniqueName: \"kubernetes.io/projected/f0884a8f-f290-4556-9808-47963ef4cd51-kube-api-access-q2frp\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.682630 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.695476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0884a8f-f290-4556-9808-47963ef4cd51" (UID: "f0884a8f-f290-4556-9808-47963ef4cd51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.717878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-config-data" (OuterVolumeSpecName: "config-data") pod "f0884a8f-f290-4556-9808-47963ef4cd51" (UID: "f0884a8f-f290-4556-9808-47963ef4cd51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.719409 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.784862 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.784898 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0884a8f-f290-4556-9808-47963ef4cd51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.817227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe","Type":"ContainerStarted","Data":"c3472d9535cb6dffc73a720a73fb9b12ed5d47366a5b8edef35d442ad0bbb13f"} Feb 18 06:09:31 crc kubenswrapper[4707]: 
I0218 06:09:31.820513 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0884a8f-f290-4556-9808-47963ef4cd51" containerID="daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5" exitCode=0 Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.820544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerDied","Data":"daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5"} Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.820563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0884a8f-f290-4556-9808-47963ef4cd51","Type":"ContainerDied","Data":"f482efa0adf1dde8bf86af08aaa6b1d7d8e2728913b959b5ac98b0b53d3d1c67"} Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.820579 4707 scope.go:117] "RemoveContainer" containerID="0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.820704 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.856332 4707 scope.go:117] "RemoveContainer" containerID="75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.861849 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.890177 4707 scope.go:117] "RemoveContainer" containerID="daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.893952 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.908172 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:09:31 crc kubenswrapper[4707]: E0218 06:09:31.908747 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="ceilometer-central-agent" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.908772 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="ceilometer-central-agent" Feb 18 06:09:31 crc kubenswrapper[4707]: E0218 06:09:31.908823 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="sg-core" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.908833 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="sg-core" Feb 18 06:09:31 crc kubenswrapper[4707]: E0218 06:09:31.908846 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="proxy-httpd" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.908857 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="proxy-httpd" Feb 18 06:09:31 crc kubenswrapper[4707]: E0218 06:09:31.908872 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="ceilometer-notification-agent" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.908888 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="ceilometer-notification-agent" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.909138 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="sg-core" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.909157 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="proxy-httpd" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.909171 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="ceilometer-central-agent" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.909186 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" containerName="ceilometer-notification-agent" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.911119 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.916297 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.917054 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.922257 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.938080 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.944355 4707 scope.go:117] "RemoveContainer" containerID="ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.969512 4707 scope.go:117] "RemoveContainer" containerID="0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a" Feb 18 06:09:31 crc kubenswrapper[4707]: E0218 06:09:31.970849 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a\": container with ID starting with 0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a not found: ID does not exist" containerID="0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.970984 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a"} err="failed to get container status \"0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a\": rpc error: code = NotFound desc = could not find container \"0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a\": 
container with ID starting with 0453febf56610ed90951cc4aba2e58495c04e44779564f470f87a1a4ece6c84a not found: ID does not exist" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.971082 4707 scope.go:117] "RemoveContainer" containerID="75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e" Feb 18 06:09:31 crc kubenswrapper[4707]: E0218 06:09:31.971591 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e\": container with ID starting with 75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e not found: ID does not exist" containerID="75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.971700 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e"} err="failed to get container status \"75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e\": rpc error: code = NotFound desc = could not find container \"75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e\": container with ID starting with 75387384b19e88941b6b658095c22ab2e99126e7bb67f678ce45991bd98def8e not found: ID does not exist" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.971789 4707 scope.go:117] "RemoveContainer" containerID="daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5" Feb 18 06:09:31 crc kubenswrapper[4707]: E0218 06:09:31.972250 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5\": container with ID starting with daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5 not found: ID does not exist" 
containerID="daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.972346 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5"} err="failed to get container status \"daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5\": rpc error: code = NotFound desc = could not find container \"daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5\": container with ID starting with daf5356cfac8981f3bd3f03335d471bb9ad1da4cbc02f6d852f2bb4f029690e5 not found: ID does not exist" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.972414 4707 scope.go:117] "RemoveContainer" containerID="ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c" Feb 18 06:09:31 crc kubenswrapper[4707]: E0218 06:09:31.972699 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c\": container with ID starting with ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c not found: ID does not exist" containerID="ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.972793 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c"} err="failed to get container status \"ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c\": rpc error: code = NotFound desc = could not find container \"ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c\": container with ID starting with ef5ff0be20152ed964050cb5bb229a98dc4f54793f7ab87d2ab7c175dad46e6c not found: ID does not exist" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.987683 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.987769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-log-httpd\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.987909 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4chw\" (UniqueName: \"kubernetes.io/projected/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-kube-api-access-p4chw\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.987936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.987984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-config-data\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.988010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-scripts\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.988168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-run-httpd\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:31 crc kubenswrapper[4707]: I0218 06:09:31.988204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.069301 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33019be9-adbc-447e-9072-71ad0fa08a14" path="/var/lib/kubelet/pods/33019be9-adbc-447e-9072-71ad0fa08a14/volumes" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.070759 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0884a8f-f290-4556-9808-47963ef4cd51" path="/var/lib/kubelet/pods/f0884a8f-f290-4556-9808-47963ef4cd51/volumes" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.091257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-run-httpd\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.091327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.091393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.091551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-log-httpd\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.091664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4chw\" (UniqueName: \"kubernetes.io/projected/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-kube-api-access-p4chw\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.091691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.091755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-config-data\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc 
kubenswrapper[4707]: I0218 06:09:32.091788 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-scripts\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.092904 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-run-httpd\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.093048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-log-httpd\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.097341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.099004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-scripts\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.100049 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.108587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.109540 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-config-data\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.112045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4chw\" (UniqueName: \"kubernetes.io/projected/a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b-kube-api-access-p4chw\") pod \"ceilometer-0\" (UID: \"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b\") " pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.226444 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.751064 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.833357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b","Type":"ContainerStarted","Data":"62b8a06b38afa32be19e1a829f93cfd77665c8d63ed29b337e2c506c6e7e362e"} Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.835123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe","Type":"ContainerStarted","Data":"7774a090c852ca56c7ba502a3aae70067dc475972aeba5aab7f8ace7b115534b"} Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.835175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe","Type":"ContainerStarted","Data":"0d1f6471fd7986ae59477f2e728be0788d5231a647a47a0dc78884cd62e61bde"} Feb 18 06:09:32 crc kubenswrapper[4707]: I0218 06:09:32.872043 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.87202498 podStartE2EDuration="2.87202498s" podCreationTimestamp="2026-02-18 06:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:32.866858411 +0000 UTC m=+1309.514817545" watchObservedRunningTime="2026-02-18 06:09:32.87202498 +0000 UTC m=+1309.519984114" Feb 18 06:09:33 crc kubenswrapper[4707]: I0218 06:09:33.390868 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:33 crc kubenswrapper[4707]: I0218 06:09:33.411765 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" 
Feb 18 06:09:33 crc kubenswrapper[4707]: I0218 06:09:33.845880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b","Type":"ContainerStarted","Data":"345a40c6c9c9d2a46ff351e1422298f9cd320fb18178f77d4131934f3004f3c9"} Feb 18 06:09:33 crc kubenswrapper[4707]: I0218 06:09:33.902030 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 18 06:09:33 crc kubenswrapper[4707]: I0218 06:09:33.914292 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g6tfw"] Feb 18 06:09:33 crc kubenswrapper[4707]: I0218 06:09:33.919926 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:33 crc kubenswrapper[4707]: I0218 06:09:33.942022 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6tfw"] Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.029311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-catalog-content\") pod \"redhat-marketplace-g6tfw\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.029411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-utilities\") pod \"redhat-marketplace-g6tfw\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.029483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-thfrf\" (UniqueName: \"kubernetes.io/projected/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-kube-api-access-thfrf\") pod \"redhat-marketplace-g6tfw\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.131443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-catalog-content\") pod \"redhat-marketplace-g6tfw\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.131846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-utilities\") pod \"redhat-marketplace-g6tfw\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.131935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thfrf\" (UniqueName: \"kubernetes.io/projected/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-kube-api-access-thfrf\") pod \"redhat-marketplace-g6tfw\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.132609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-catalog-content\") pod \"redhat-marketplace-g6tfw\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.133473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-utilities\") pod \"redhat-marketplace-g6tfw\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.154177 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-grmw2"] Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.155718 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.159223 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.159293 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.162087 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thfrf\" (UniqueName: \"kubernetes.io/projected/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-kube-api-access-thfrf\") pod \"redhat-marketplace-g6tfw\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.177411 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-grmw2"] Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.234896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-scripts\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.235273 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.235473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5gnr\" (UniqueName: \"kubernetes.io/projected/362e99d9-d597-4698-ba43-339a153a5ff9-kube-api-access-c5gnr\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.235549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-config-data\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.338741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.338861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5gnr\" (UniqueName: \"kubernetes.io/projected/362e99d9-d597-4698-ba43-339a153a5ff9-kube-api-access-c5gnr\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 
06:09:34.338900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-config-data\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.338948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-scripts\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.343739 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-scripts\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.345402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.345651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-config-data\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.366256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5gnr\" 
(UniqueName: \"kubernetes.io/projected/362e99d9-d597-4698-ba43-339a153a5ff9-kube-api-access-c5gnr\") pod \"nova-cell1-cell-mapping-grmw2\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.387867 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.526463 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.547064 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.629116 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b49846fdf-kstxv"] Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.629397 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" podUID="a3871817-d356-4035-8d85-e42993ddad4f" containerName="dnsmasq-dns" containerID="cri-o://25c61f62abd4d903048f3081d70011f274201fe023c202212ab7665591c54261" gracePeriod=10 Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.900965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b","Type":"ContainerStarted","Data":"09a402ecc541129809df47ad82198666bcb036e07b7d23106cc93cbcf8fa7b06"} Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.907950 4707 generic.go:334] "Generic (PLEG): container finished" podID="a3871817-d356-4035-8d85-e42993ddad4f" containerID="25c61f62abd4d903048f3081d70011f274201fe023c202212ab7665591c54261" exitCode=0 Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.909068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" event={"ID":"a3871817-d356-4035-8d85-e42993ddad4f","Type":"ContainerDied","Data":"25c61f62abd4d903048f3081d70011f274201fe023c202212ab7665591c54261"} Feb 18 06:09:34 crc kubenswrapper[4707]: I0218 06:09:34.955546 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6tfw"] Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.195521 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-grmw2"] Feb 18 06:09:35 crc kubenswrapper[4707]: W0218 06:09:35.196229 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod362e99d9_d597_4698_ba43_339a153a5ff9.slice/crio-37f03a19b6c481440c336b75eb9cd942a8dfd619a631372bfa2ba8bb6bea8803 WatchSource:0}: Error finding container 37f03a19b6c481440c336b75eb9cd942a8dfd619a631372bfa2ba8bb6bea8803: Status 404 returned error can't find the container with id 37f03a19b6c481440c336b75eb9cd942a8dfd619a631372bfa2ba8bb6bea8803 Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.210745 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.365584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-sb\") pod \"a3871817-d356-4035-8d85-e42993ddad4f\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.366151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgksd\" (UniqueName: \"kubernetes.io/projected/a3871817-d356-4035-8d85-e42993ddad4f-kube-api-access-zgksd\") pod \"a3871817-d356-4035-8d85-e42993ddad4f\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.366197 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-swift-storage-0\") pod \"a3871817-d356-4035-8d85-e42993ddad4f\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.366217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-svc\") pod \"a3871817-d356-4035-8d85-e42993ddad4f\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.366366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-config\") pod \"a3871817-d356-4035-8d85-e42993ddad4f\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.366570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-nb\") pod \"a3871817-d356-4035-8d85-e42993ddad4f\" (UID: \"a3871817-d356-4035-8d85-e42993ddad4f\") " Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.374481 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3871817-d356-4035-8d85-e42993ddad4f-kube-api-access-zgksd" (OuterVolumeSpecName: "kube-api-access-zgksd") pod "a3871817-d356-4035-8d85-e42993ddad4f" (UID: "a3871817-d356-4035-8d85-e42993ddad4f"). InnerVolumeSpecName "kube-api-access-zgksd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.442508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3871817-d356-4035-8d85-e42993ddad4f" (UID: "a3871817-d356-4035-8d85-e42993ddad4f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.449427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-config" (OuterVolumeSpecName: "config") pod "a3871817-d356-4035-8d85-e42993ddad4f" (UID: "a3871817-d356-4035-8d85-e42993ddad4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.458904 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a3871817-d356-4035-8d85-e42993ddad4f" (UID: "a3871817-d356-4035-8d85-e42993ddad4f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.469317 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.469344 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgksd\" (UniqueName: \"kubernetes.io/projected/a3871817-d356-4035-8d85-e42993ddad4f-kube-api-access-zgksd\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.469379 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.469388 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.471360 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3871817-d356-4035-8d85-e42993ddad4f" (UID: "a3871817-d356-4035-8d85-e42993ddad4f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.515349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3871817-d356-4035-8d85-e42993ddad4f" (UID: "a3871817-d356-4035-8d85-e42993ddad4f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.572390 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.572413 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3871817-d356-4035-8d85-e42993ddad4f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.951708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b","Type":"ContainerStarted","Data":"7d70c38a34ee43db4aea228a75e9c4c0f5a52f30dcbd787c935f01feb71145e7"} Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.961410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" event={"ID":"a3871817-d356-4035-8d85-e42993ddad4f","Type":"ContainerDied","Data":"fca723528a1b402a9ace874216d737bfe00a68543be3ac381cf4c208a3d14262"} Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.961513 4707 scope.go:117] "RemoveContainer" containerID="25c61f62abd4d903048f3081d70011f274201fe023c202212ab7665591c54261" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.961531 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b49846fdf-kstxv" Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.973438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-grmw2" event={"ID":"362e99d9-d597-4698-ba43-339a153a5ff9","Type":"ContainerStarted","Data":"130dfe5eb55c2d592f127c6835ef2eb0ed35bc28813e06ad274b0a7f679da466"} Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.973497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-grmw2" event={"ID":"362e99d9-d597-4698-ba43-339a153a5ff9","Type":"ContainerStarted","Data":"37f03a19b6c481440c336b75eb9cd942a8dfd619a631372bfa2ba8bb6bea8803"} Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.975279 4707 generic.go:334] "Generic (PLEG): container finished" podID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerID="897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b" exitCode=0 Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.975324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6tfw" event={"ID":"43aa8eda-b900-47b0-af85-1b9ecb9b59c6","Type":"ContainerDied","Data":"897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b"} Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.975347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6tfw" event={"ID":"43aa8eda-b900-47b0-af85-1b9ecb9b59c6","Type":"ContainerStarted","Data":"1d4c9b3ab24c33ede829d884fc1bbc9653efbccc3275b7572e00bebfbe43b617"} Feb 18 06:09:35 crc kubenswrapper[4707]: I0218 06:09:35.997956 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-grmw2" podStartSLOduration=1.997935296 podStartE2EDuration="1.997935296s" podCreationTimestamp="2026-02-18 06:09:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-18 06:09:35.992229513 +0000 UTC m=+1312.640188657" watchObservedRunningTime="2026-02-18 06:09:35.997935296 +0000 UTC m=+1312.645894430" Feb 18 06:09:36 crc kubenswrapper[4707]: I0218 06:09:36.023852 4707 scope.go:117] "RemoveContainer" containerID="10d73319837d994e64db6b38221e2d65319271b8480a72f7da72676923a91e5e" Feb 18 06:09:36 crc kubenswrapper[4707]: I0218 06:09:36.031717 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b49846fdf-kstxv"] Feb 18 06:09:36 crc kubenswrapper[4707]: I0218 06:09:36.040096 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b49846fdf-kstxv"] Feb 18 06:09:36 crc kubenswrapper[4707]: I0218 06:09:36.066239 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3871817-d356-4035-8d85-e42993ddad4f" path="/var/lib/kubelet/pods/a3871817-d356-4035-8d85-e42993ddad4f/volumes" Feb 18 06:09:36 crc kubenswrapper[4707]: I0218 06:09:36.990911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6tfw" event={"ID":"43aa8eda-b900-47b0-af85-1b9ecb9b59c6","Type":"ContainerStarted","Data":"f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0"} Feb 18 06:09:38 crc kubenswrapper[4707]: I0218 06:09:38.007113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b","Type":"ContainerStarted","Data":"683cf650209a2fc36aa09ce4439ec4e6d45380e30e7e050616d12b015ef7e4a4"} Feb 18 06:09:38 crc kubenswrapper[4707]: I0218 06:09:38.007561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 06:09:38 crc kubenswrapper[4707]: I0218 06:09:38.009654 4707 generic.go:334] "Generic (PLEG): container finished" podID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerID="f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0" exitCode=0 Feb 18 06:09:38 crc 
kubenswrapper[4707]: I0218 06:09:38.009693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6tfw" event={"ID":"43aa8eda-b900-47b0-af85-1b9ecb9b59c6","Type":"ContainerDied","Data":"f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0"} Feb 18 06:09:38 crc kubenswrapper[4707]: I0218 06:09:38.033198 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.048954973 podStartE2EDuration="7.033179362s" podCreationTimestamp="2026-02-18 06:09:31 +0000 UTC" firstStartedPulling="2026-02-18 06:09:32.758890454 +0000 UTC m=+1309.406849598" lastFinishedPulling="2026-02-18 06:09:36.743114853 +0000 UTC m=+1313.391073987" observedRunningTime="2026-02-18 06:09:38.028127217 +0000 UTC m=+1314.676086361" watchObservedRunningTime="2026-02-18 06:09:38.033179362 +0000 UTC m=+1314.681138496" Feb 18 06:09:39 crc kubenswrapper[4707]: I0218 06:09:39.023115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6tfw" event={"ID":"43aa8eda-b900-47b0-af85-1b9ecb9b59c6","Type":"ContainerStarted","Data":"1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588"} Feb 18 06:09:39 crc kubenswrapper[4707]: I0218 06:09:39.057317 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g6tfw" podStartSLOduration=3.23136375 podStartE2EDuration="6.057300135s" podCreationTimestamp="2026-02-18 06:09:33 +0000 UTC" firstStartedPulling="2026-02-18 06:09:35.977124347 +0000 UTC m=+1312.625083481" lastFinishedPulling="2026-02-18 06:09:38.803060732 +0000 UTC m=+1315.451019866" observedRunningTime="2026-02-18 06:09:39.047439591 +0000 UTC m=+1315.695398725" watchObservedRunningTime="2026-02-18 06:09:39.057300135 +0000 UTC m=+1315.705259269" Feb 18 06:09:41 crc kubenswrapper[4707]: I0218 06:09:41.046295 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="362e99d9-d597-4698-ba43-339a153a5ff9" containerID="130dfe5eb55c2d592f127c6835ef2eb0ed35bc28813e06ad274b0a7f679da466" exitCode=0 Feb 18 06:09:41 crc kubenswrapper[4707]: I0218 06:09:41.046468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-grmw2" event={"ID":"362e99d9-d597-4698-ba43-339a153a5ff9","Type":"ContainerDied","Data":"130dfe5eb55c2d592f127c6835ef2eb0ed35bc28813e06ad274b0a7f679da466"} Feb 18 06:09:41 crc kubenswrapper[4707]: I0218 06:09:41.234039 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:09:41 crc kubenswrapper[4707]: I0218 06:09:41.234117 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.253093 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.253139 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.435956 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.532119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5gnr\" (UniqueName: \"kubernetes.io/projected/362e99d9-d597-4698-ba43-339a153a5ff9-kube-api-access-c5gnr\") pod \"362e99d9-d597-4698-ba43-339a153a5ff9\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.532367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-combined-ca-bundle\") pod \"362e99d9-d597-4698-ba43-339a153a5ff9\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.532457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-scripts\") pod \"362e99d9-d597-4698-ba43-339a153a5ff9\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.532536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-config-data\") pod \"362e99d9-d597-4698-ba43-339a153a5ff9\" (UID: \"362e99d9-d597-4698-ba43-339a153a5ff9\") " Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.538162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362e99d9-d597-4698-ba43-339a153a5ff9-kube-api-access-c5gnr" (OuterVolumeSpecName: "kube-api-access-c5gnr") pod "362e99d9-d597-4698-ba43-339a153a5ff9" (UID: "362e99d9-d597-4698-ba43-339a153a5ff9"). InnerVolumeSpecName "kube-api-access-c5gnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.550990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-scripts" (OuterVolumeSpecName: "scripts") pod "362e99d9-d597-4698-ba43-339a153a5ff9" (UID: "362e99d9-d597-4698-ba43-339a153a5ff9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.563096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-config-data" (OuterVolumeSpecName: "config-data") pod "362e99d9-d597-4698-ba43-339a153a5ff9" (UID: "362e99d9-d597-4698-ba43-339a153a5ff9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.564031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "362e99d9-d597-4698-ba43-339a153a5ff9" (UID: "362e99d9-d597-4698-ba43-339a153a5ff9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.635162 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5gnr\" (UniqueName: \"kubernetes.io/projected/362e99d9-d597-4698-ba43-339a153a5ff9-kube-api-access-c5gnr\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.635213 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.635229 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:42 crc kubenswrapper[4707]: I0218 06:09:42.635242 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/362e99d9-d597-4698-ba43-339a153a5ff9-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.069919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-grmw2" event={"ID":"362e99d9-d597-4698-ba43-339a153a5ff9","Type":"ContainerDied","Data":"37f03a19b6c481440c336b75eb9cd942a8dfd619a631372bfa2ba8bb6bea8803"} Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.069985 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37f03a19b6c481440c336b75eb9cd942a8dfd619a631372bfa2ba8bb6bea8803" Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.070044 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-grmw2" Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.276504 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.277325 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerName="nova-api-log" containerID="cri-o://0d1f6471fd7986ae59477f2e728be0788d5231a647a47a0dc78884cd62e61bde" gracePeriod=30 Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.277438 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerName="nova-api-api" containerID="cri-o://7774a090c852ca56c7ba502a3aae70067dc475972aeba5aab7f8ace7b115534b" gracePeriod=30 Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.325613 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.326598 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-log" containerID="cri-o://4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31" gracePeriod=30 Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.326832 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-metadata" containerID="cri-o://680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814" gracePeriod=30 Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.347612 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:43 crc kubenswrapper[4707]: I0218 06:09:43.347927 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a64cc687-253e-4b0b-8394-a85a561826cc" containerName="nova-scheduler-scheduler" containerID="cri-o://f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79" gracePeriod=30 Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.079527 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerID="0d1f6471fd7986ae59477f2e728be0788d5231a647a47a0dc78884cd62e61bde" exitCode=143 Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.079603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe","Type":"ContainerDied","Data":"0d1f6471fd7986ae59477f2e728be0788d5231a647a47a0dc78884cd62e61bde"} Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.083265 4707 generic.go:334] "Generic (PLEG): container finished" podID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerID="4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31" exitCode=143 Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.083323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55fa9858-323a-452c-a0aa-7a1207e40ca2","Type":"ContainerDied","Data":"4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31"} Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.388505 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.388871 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.438509 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:44 crc 
kubenswrapper[4707]: I0218 06:09:44.781828 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.786810 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vhmd\" (UniqueName: \"kubernetes.io/projected/a64cc687-253e-4b0b-8394-a85a561826cc-kube-api-access-7vhmd\") pod \"a64cc687-253e-4b0b-8394-a85a561826cc\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.787073 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-config-data\") pod \"a64cc687-253e-4b0b-8394-a85a561826cc\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.787311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-combined-ca-bundle\") pod \"a64cc687-253e-4b0b-8394-a85a561826cc\" (UID: \"a64cc687-253e-4b0b-8394-a85a561826cc\") " Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.800606 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64cc687-253e-4b0b-8394-a85a561826cc-kube-api-access-7vhmd" (OuterVolumeSpecName: "kube-api-access-7vhmd") pod "a64cc687-253e-4b0b-8394-a85a561826cc" (UID: "a64cc687-253e-4b0b-8394-a85a561826cc"). InnerVolumeSpecName "kube-api-access-7vhmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.839072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-config-data" (OuterVolumeSpecName: "config-data") pod "a64cc687-253e-4b0b-8394-a85a561826cc" (UID: "a64cc687-253e-4b0b-8394-a85a561826cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.844992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a64cc687-253e-4b0b-8394-a85a561826cc" (UID: "a64cc687-253e-4b0b-8394-a85a561826cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.888557 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.888594 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vhmd\" (UniqueName: \"kubernetes.io/projected/a64cc687-253e-4b0b-8394-a85a561826cc-kube-api-access-7vhmd\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:44 crc kubenswrapper[4707]: I0218 06:09:44.888606 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64cc687-253e-4b0b-8394-a85a561826cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.095216 4707 generic.go:334] "Generic (PLEG): container finished" podID="a64cc687-253e-4b0b-8394-a85a561826cc" containerID="f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79" 
exitCode=0 Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.095907 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.098310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a64cc687-253e-4b0b-8394-a85a561826cc","Type":"ContainerDied","Data":"f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79"} Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.098348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a64cc687-253e-4b0b-8394-a85a561826cc","Type":"ContainerDied","Data":"9b2ca0db34405feec2ef54155195010a2e77309491ecd761d8a638973b1f9537"} Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.098364 4707 scope.go:117] "RemoveContainer" containerID="f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.136043 4707 scope.go:117] "RemoveContainer" containerID="f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79" Feb 18 06:09:45 crc kubenswrapper[4707]: E0218 06:09:45.136859 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79\": container with ID starting with f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79 not found: ID does not exist" containerID="f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.136901 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79"} err="failed to get container status \"f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79\": rpc error: code = NotFound desc = could not find 
container \"f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79\": container with ID starting with f44f7f7f5f26f09ec2b7b174b8bbcaca45a2e36f1a904fa46d26955d94156d79 not found: ID does not exist" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.140458 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.148623 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.167275 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.179696 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:45 crc kubenswrapper[4707]: E0218 06:09:45.180252 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3871817-d356-4035-8d85-e42993ddad4f" containerName="init" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.180278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3871817-d356-4035-8d85-e42993ddad4f" containerName="init" Feb 18 06:09:45 crc kubenswrapper[4707]: E0218 06:09:45.180295 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362e99d9-d597-4698-ba43-339a153a5ff9" containerName="nova-manage" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.180305 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="362e99d9-d597-4698-ba43-339a153a5ff9" containerName="nova-manage" Feb 18 06:09:45 crc kubenswrapper[4707]: E0218 06:09:45.180351 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3871817-d356-4035-8d85-e42993ddad4f" containerName="dnsmasq-dns" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.180361 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3871817-d356-4035-8d85-e42993ddad4f" 
containerName="dnsmasq-dns" Feb 18 06:09:45 crc kubenswrapper[4707]: E0218 06:09:45.180381 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64cc687-253e-4b0b-8394-a85a561826cc" containerName="nova-scheduler-scheduler" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.180390 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64cc687-253e-4b0b-8394-a85a561826cc" containerName="nova-scheduler-scheduler" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.180629 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64cc687-253e-4b0b-8394-a85a561826cc" containerName="nova-scheduler-scheduler" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.180660 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="362e99d9-d597-4698-ba43-339a153a5ff9" containerName="nova-manage" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.180681 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3871817-d356-4035-8d85-e42993ddad4f" containerName="dnsmasq-dns" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.181512 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.185742 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.191178 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.224273 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6tfw"] Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.296959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda1514b-6a18-4c59-8d92-4168f4dc589f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cda1514b-6a18-4c59-8d92-4168f4dc589f\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.297842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhtc7\" (UniqueName: \"kubernetes.io/projected/cda1514b-6a18-4c59-8d92-4168f4dc589f-kube-api-access-lhtc7\") pod \"nova-scheduler-0\" (UID: \"cda1514b-6a18-4c59-8d92-4168f4dc589f\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.297962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda1514b-6a18-4c59-8d92-4168f4dc589f-config-data\") pod \"nova-scheduler-0\" (UID: \"cda1514b-6a18-4c59-8d92-4168f4dc589f\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.400297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhtc7\" (UniqueName: \"kubernetes.io/projected/cda1514b-6a18-4c59-8d92-4168f4dc589f-kube-api-access-lhtc7\") pod 
\"nova-scheduler-0\" (UID: \"cda1514b-6a18-4c59-8d92-4168f4dc589f\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.400360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda1514b-6a18-4c59-8d92-4168f4dc589f-config-data\") pod \"nova-scheduler-0\" (UID: \"cda1514b-6a18-4c59-8d92-4168f4dc589f\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.400524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda1514b-6a18-4c59-8d92-4168f4dc589f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cda1514b-6a18-4c59-8d92-4168f4dc589f\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.404383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda1514b-6a18-4c59-8d92-4168f4dc589f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cda1514b-6a18-4c59-8d92-4168f4dc589f\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.405615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda1514b-6a18-4c59-8d92-4168f4dc589f-config-data\") pod \"nova-scheduler-0\" (UID: \"cda1514b-6a18-4c59-8d92-4168f4dc589f\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.419704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhtc7\" (UniqueName: \"kubernetes.io/projected/cda1514b-6a18-4c59-8d92-4168f4dc589f-kube-api-access-lhtc7\") pod \"nova-scheduler-0\" (UID: \"cda1514b-6a18-4c59-8d92-4168f4dc589f\") " pod="openstack/nova-scheduler-0" Feb 18 06:09:45 crc kubenswrapper[4707]: I0218 06:09:45.504032 4707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 06:09:46 crc kubenswrapper[4707]: I0218 06:09:45.998106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 06:09:46 crc kubenswrapper[4707]: I0218 06:09:46.065039 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a64cc687-253e-4b0b-8394-a85a561826cc" path="/var/lib/kubelet/pods/a64cc687-253e-4b0b-8394-a85a561826cc/volumes" Feb 18 06:09:46 crc kubenswrapper[4707]: I0218 06:09:46.107508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cda1514b-6a18-4c59-8d92-4168f4dc589f","Type":"ContainerStarted","Data":"09cfa799d19ee1e670b9a928daf1f3fda1fcf2391acae342be640111ae7b2fa1"} Feb 18 06:09:46 crc kubenswrapper[4707]: I0218 06:09:46.465532 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": read tcp 10.217.0.2:39714->10.217.0.215:8775: read: connection reset by peer" Feb 18 06:09:46 crc kubenswrapper[4707]: I0218 06:09:46.465580 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": read tcp 10.217.0.2:39720->10.217.0.215:8775: read: connection reset by peer" Feb 18 06:09:46 crc kubenswrapper[4707]: I0218 06:09:46.996252 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.120535 4707 generic.go:334] "Generic (PLEG): container finished" podID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerID="680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814" exitCode=0 Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.120568 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.120598 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55fa9858-323a-452c-a0aa-7a1207e40ca2","Type":"ContainerDied","Data":"680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814"} Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.120626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55fa9858-323a-452c-a0aa-7a1207e40ca2","Type":"ContainerDied","Data":"24f597ba259204c97b1c040feb9ac8eda9271a28ddf3d45d67cbe61829e543e2"} Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.120646 4707 scope.go:117] "RemoveContainer" containerID="680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.122780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cda1514b-6a18-4c59-8d92-4168f4dc589f","Type":"ContainerStarted","Data":"b1e90aa09b143c37ae936da4c3df324321582ae8b196e7ab6b31ac0bb6df1660"} Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.122937 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g6tfw" podUID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerName="registry-server" containerID="cri-o://1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588" gracePeriod=2 Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 
06:09:47.141765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-nova-metadata-tls-certs\") pod \"55fa9858-323a-452c-a0aa-7a1207e40ca2\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.141834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc9sz\" (UniqueName: \"kubernetes.io/projected/55fa9858-323a-452c-a0aa-7a1207e40ca2-kube-api-access-sc9sz\") pod \"55fa9858-323a-452c-a0aa-7a1207e40ca2\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.141875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-config-data\") pod \"55fa9858-323a-452c-a0aa-7a1207e40ca2\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.141934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fa9858-323a-452c-a0aa-7a1207e40ca2-logs\") pod \"55fa9858-323a-452c-a0aa-7a1207e40ca2\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.142062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-combined-ca-bundle\") pod \"55fa9858-323a-452c-a0aa-7a1207e40ca2\" (UID: \"55fa9858-323a-452c-a0aa-7a1207e40ca2\") " Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.145144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55fa9858-323a-452c-a0aa-7a1207e40ca2-logs" (OuterVolumeSpecName: "logs") pod 
"55fa9858-323a-452c-a0aa-7a1207e40ca2" (UID: "55fa9858-323a-452c-a0aa-7a1207e40ca2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.152307 4707 scope.go:117] "RemoveContainer" containerID="4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.152504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55fa9858-323a-452c-a0aa-7a1207e40ca2-kube-api-access-sc9sz" (OuterVolumeSpecName: "kube-api-access-sc9sz") pod "55fa9858-323a-452c-a0aa-7a1207e40ca2" (UID: "55fa9858-323a-452c-a0aa-7a1207e40ca2"). InnerVolumeSpecName "kube-api-access-sc9sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.165775 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.165751919 podStartE2EDuration="2.165751919s" podCreationTimestamp="2026-02-18 06:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:47.154033304 +0000 UTC m=+1323.801992438" watchObservedRunningTime="2026-02-18 06:09:47.165751919 +0000 UTC m=+1323.813711053" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.182080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-config-data" (OuterVolumeSpecName: "config-data") pod "55fa9858-323a-452c-a0aa-7a1207e40ca2" (UID: "55fa9858-323a-452c-a0aa-7a1207e40ca2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.210961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55fa9858-323a-452c-a0aa-7a1207e40ca2" (UID: "55fa9858-323a-452c-a0aa-7a1207e40ca2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.215754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "55fa9858-323a-452c-a0aa-7a1207e40ca2" (UID: "55fa9858-323a-452c-a0aa-7a1207e40ca2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.245648 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.245685 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc9sz\" (UniqueName: \"kubernetes.io/projected/55fa9858-323a-452c-a0aa-7a1207e40ca2-kube-api-access-sc9sz\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.245695 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.245705 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/55fa9858-323a-452c-a0aa-7a1207e40ca2-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.245714 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fa9858-323a-452c-a0aa-7a1207e40ca2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.382209 4707 scope.go:117] "RemoveContainer" containerID="680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814" Feb 18 06:09:47 crc kubenswrapper[4707]: E0218 06:09:47.383530 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814\": container with ID starting with 680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814 not found: ID does not exist" containerID="680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.383580 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814"} err="failed to get container status \"680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814\": rpc error: code = NotFound desc = could not find container \"680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814\": container with ID starting with 680e1951e8399648b3d0f8d9c14c6c49b6db1bd323bbc3d21089d1b5ed10c814 not found: ID does not exist" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.383611 4707 scope.go:117] "RemoveContainer" containerID="4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31" Feb 18 06:09:47 crc kubenswrapper[4707]: E0218 06:09:47.384080 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31\": container with ID starting with 4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31 not found: ID does not exist" containerID="4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.384109 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31"} err="failed to get container status \"4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31\": rpc error: code = NotFound desc = could not find container \"4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31\": container with ID starting with 4ab9e84da55c5bc8fac96862c4ecc3e0bf326eeb8f3486fdfc1fe3592ad91f31 not found: ID does not exist" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.499575 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.524937 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.547943 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:09:47 crc kubenswrapper[4707]: E0218 06:09:47.548309 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-log" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.548322 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-log" Feb 18 06:09:47 crc kubenswrapper[4707]: E0218 06:09:47.548349 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-metadata" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.548354 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-metadata" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.548531 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-metadata" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.548554 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" containerName="nova-metadata-log" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.549472 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.554517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.554769 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.568041 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.652100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-logs\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.652193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98md2\" (UniqueName: \"kubernetes.io/projected/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-kube-api-access-98md2\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc 
kubenswrapper[4707]: I0218 06:09:47.652260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.652320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.652362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-config-data\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.722253 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.754014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.754097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-config-data\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.754164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-logs\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.754208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98md2\" (UniqueName: \"kubernetes.io/projected/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-kube-api-access-98md2\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.754262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.755604 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-logs\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.762432 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.762560 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-config-data\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.776392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.776532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98md2\" (UniqueName: \"kubernetes.io/projected/ca2ba934-fce0-4bc1-af4e-27d758f7aef6-kube-api-access-98md2\") pod \"nova-metadata-0\" (UID: \"ca2ba934-fce0-4bc1-af4e-27d758f7aef6\") " pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.855252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-catalog-content\") pod \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\" (UID: 
\"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.855574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thfrf\" (UniqueName: \"kubernetes.io/projected/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-kube-api-access-thfrf\") pod \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.855665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-utilities\") pod \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\" (UID: \"43aa8eda-b900-47b0-af85-1b9ecb9b59c6\") " Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.856774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-utilities" (OuterVolumeSpecName: "utilities") pod "43aa8eda-b900-47b0-af85-1b9ecb9b59c6" (UID: "43aa8eda-b900-47b0-af85-1b9ecb9b59c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.863229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-kube-api-access-thfrf" (OuterVolumeSpecName: "kube-api-access-thfrf") pod "43aa8eda-b900-47b0-af85-1b9ecb9b59c6" (UID: "43aa8eda-b900-47b0-af85-1b9ecb9b59c6"). InnerVolumeSpecName "kube-api-access-thfrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.879946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43aa8eda-b900-47b0-af85-1b9ecb9b59c6" (UID: "43aa8eda-b900-47b0-af85-1b9ecb9b59c6"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.907529 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.983137 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thfrf\" (UniqueName: \"kubernetes.io/projected/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-kube-api-access-thfrf\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.983435 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:47 crc kubenswrapper[4707]: I0218 06:09:47.983454 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43aa8eda-b900-47b0-af85-1b9ecb9b59c6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.065106 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55fa9858-323a-452c-a0aa-7a1207e40ca2" path="/var/lib/kubelet/pods/55fa9858-323a-452c-a0aa-7a1207e40ca2/volumes" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.136179 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerID="7774a090c852ca56c7ba502a3aae70067dc475972aeba5aab7f8ace7b115534b" exitCode=0 Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.136261 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe","Type":"ContainerDied","Data":"7774a090c852ca56c7ba502a3aae70067dc475972aeba5aab7f8ace7b115534b"} Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.140007 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerID="1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588" exitCode=0 Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.140530 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6tfw" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.141197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6tfw" event={"ID":"43aa8eda-b900-47b0-af85-1b9ecb9b59c6","Type":"ContainerDied","Data":"1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588"} Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.141227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6tfw" event={"ID":"43aa8eda-b900-47b0-af85-1b9ecb9b59c6","Type":"ContainerDied","Data":"1d4c9b3ab24c33ede829d884fc1bbc9653efbccc3275b7572e00bebfbe43b617"} Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.141246 4707 scope.go:117] "RemoveContainer" containerID="1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.198779 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.210282 4707 scope.go:117] "RemoveContainer" containerID="f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.217817 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6tfw"] Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.237687 4707 scope.go:117] "RemoveContainer" containerID="897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.241854 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6tfw"] Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.278915 4707 scope.go:117] "RemoveContainer" containerID="1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588" Feb 18 06:09:48 crc kubenswrapper[4707]: E0218 06:09:48.279316 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588\": container with ID starting with 1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588 not found: ID does not exist" containerID="1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.279346 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588"} err="failed to get container status \"1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588\": rpc error: code = NotFound desc = could not find container \"1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588\": container with ID starting with 1a7a631dbe530ca84d7b3ac2d6eab3841daf8f344d595c920f4a0ec5d7fcc588 not found: ID does not exist" Feb 18 
06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.279367 4707 scope.go:117] "RemoveContainer" containerID="f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0" Feb 18 06:09:48 crc kubenswrapper[4707]: E0218 06:09:48.279857 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0\": container with ID starting with f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0 not found: ID does not exist" containerID="f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.279893 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0"} err="failed to get container status \"f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0\": rpc error: code = NotFound desc = could not find container \"f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0\": container with ID starting with f949a0a3c4afc9e3381dd1bf29ad665e74664c833b95eb59c5aa907d6f6f59b0 not found: ID does not exist" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.279907 4707 scope.go:117] "RemoveContainer" containerID="897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b" Feb 18 06:09:48 crc kubenswrapper[4707]: E0218 06:09:48.280114 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b\": container with ID starting with 897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b not found: ID does not exist" containerID="897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.280155 4707 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b"} err="failed to get container status \"897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b\": rpc error: code = NotFound desc = could not find container \"897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b\": container with ID starting with 897e21d74da0dda5b3f022fc8fba570fb6686ffa29092e78498eddea7646ff9b not found: ID does not exist" Feb 18 06:09:48 crc kubenswrapper[4707]: W0218 06:09:48.360243 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca2ba934_fce0_4bc1_af4e_27d758f7aef6.slice/crio-c020226fe42d9e72916033966d35e3e9f795a005effce25d6c9ecb838d4df994 WatchSource:0}: Error finding container c020226fe42d9e72916033966d35e3e9f795a005effce25d6c9ecb838d4df994: Status 404 returned error can't find the container with id c020226fe42d9e72916033966d35e3e9f795a005effce25d6c9ecb838d4df994 Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.365034 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.390002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-combined-ca-bundle\") pod \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.390068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-public-tls-certs\") pod \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.390191 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jc58p\" (UniqueName: \"kubernetes.io/projected/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-kube-api-access-jc58p\") pod \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.390243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-logs\") pod \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.390338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-config-data\") pod \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.390449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-internal-tls-certs\") pod \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\" (UID: \"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe\") " Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.390769 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-logs" (OuterVolumeSpecName: "logs") pod "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" (UID: "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.391244 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-logs\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.395287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-kube-api-access-jc58p" (OuterVolumeSpecName: "kube-api-access-jc58p") pod "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" (UID: "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe"). InnerVolumeSpecName "kube-api-access-jc58p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.419599 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" (UID: "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.436654 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-config-data" (OuterVolumeSpecName: "config-data") pod "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" (UID: "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.448057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" (UID: "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.471023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" (UID: "9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.493671 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.493708 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.493718 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.493753 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc58p\" (UniqueName: \"kubernetes.io/projected/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-kube-api-access-jc58p\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:48 crc kubenswrapper[4707]: I0218 06:09:48.493766 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.158527 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca2ba934-fce0-4bc1-af4e-27d758f7aef6","Type":"ContainerStarted","Data":"7de4ad12e9eaa8758b34be539599218133e05f95e2e47149ca25c4c09f236865"} Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.158897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca2ba934-fce0-4bc1-af4e-27d758f7aef6","Type":"ContainerStarted","Data":"193ff537b105d3dee4bb137e3bbd7b8eb602bcd21967d717d39728dfb0e906ca"} Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.158914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca2ba934-fce0-4bc1-af4e-27d758f7aef6","Type":"ContainerStarted","Data":"c020226fe42d9e72916033966d35e3e9f795a005effce25d6c9ecb838d4df994"} Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.160962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe","Type":"ContainerDied","Data":"c3472d9535cb6dffc73a720a73fb9b12ed5d47366a5b8edef35d442ad0bbb13f"} Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.161012 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.161019 4707 scope.go:117] "RemoveContainer" containerID="7774a090c852ca56c7ba502a3aae70067dc475972aeba5aab7f8ace7b115534b" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.189100 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.189079626 podStartE2EDuration="2.189079626s" podCreationTimestamp="2026-02-18 06:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:49.179345004 +0000 UTC m=+1325.827304148" watchObservedRunningTime="2026-02-18 06:09:49.189079626 +0000 UTC m=+1325.837038760" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.203441 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.222612 4707 scope.go:117] "RemoveContainer" containerID="0d1f6471fd7986ae59477f2e728be0788d5231a647a47a0dc78884cd62e61bde" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.226890 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.243394 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:49 crc kubenswrapper[4707]: E0218 06:09:49.243861 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerName="registry-server" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.243882 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerName="registry-server" Feb 18 06:09:49 crc kubenswrapper[4707]: E0218 06:09:49.243897 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" 
containerName="nova-api-log" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.243905 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerName="nova-api-log" Feb 18 06:09:49 crc kubenswrapper[4707]: E0218 06:09:49.243931 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerName="extract-content" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.243939 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerName="extract-content" Feb 18 06:09:49 crc kubenswrapper[4707]: E0218 06:09:49.243955 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerName="nova-api-api" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.243962 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerName="nova-api-api" Feb 18 06:09:49 crc kubenswrapper[4707]: E0218 06:09:49.243973 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerName="extract-utilities" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.243981 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" containerName="extract-utilities" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.244208 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerName="nova-api-api" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.244234 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" containerName="nova-api-log" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.244253 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" 
containerName="registry-server" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.245506 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.248827 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.249033 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.249052 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.261052 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.310594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.310679 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmlzk\" (UniqueName: \"kubernetes.io/projected/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-kube-api-access-tmlzk\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.310889 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc 
kubenswrapper[4707]: I0218 06:09:49.310998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.311123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-logs\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.311250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-config-data\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.413343 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.413435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-logs\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.413530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-config-data\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.413597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.413706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmlzk\" (UniqueName: \"kubernetes.io/projected/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-kube-api-access-tmlzk\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.413790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.414231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-logs\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.419382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.419526 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.421822 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-config-data\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.427971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.431530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmlzk\" (UniqueName: \"kubernetes.io/projected/ff7a9ac0-e9b0-4497-ad55-18768ff36da1-kube-api-access-tmlzk\") pod \"nova-api-0\" (UID: \"ff7a9ac0-e9b0-4497-ad55-18768ff36da1\") " pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.570993 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 06:09:49 crc kubenswrapper[4707]: I0218 06:09:49.984000 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 06:09:49 crc kubenswrapper[4707]: W0218 06:09:49.989981 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff7a9ac0_e9b0_4497_ad55_18768ff36da1.slice/crio-88b3667e7b6653feebbf2bfbfb25bb84188b46a2a9afdaf15041bcccc3527730 WatchSource:0}: Error finding container 88b3667e7b6653feebbf2bfbfb25bb84188b46a2a9afdaf15041bcccc3527730: Status 404 returned error can't find the container with id 88b3667e7b6653feebbf2bfbfb25bb84188b46a2a9afdaf15041bcccc3527730 Feb 18 06:09:50 crc kubenswrapper[4707]: I0218 06:09:50.066285 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43aa8eda-b900-47b0-af85-1b9ecb9b59c6" path="/var/lib/kubelet/pods/43aa8eda-b900-47b0-af85-1b9ecb9b59c6/volumes" Feb 18 06:09:50 crc kubenswrapper[4707]: I0218 06:09:50.067591 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe" path="/var/lib/kubelet/pods/9cfffd59-4d3f-4fb5-a85a-3a38edc3d8fe/volumes" Feb 18 06:09:50 crc kubenswrapper[4707]: I0218 06:09:50.174010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff7a9ac0-e9b0-4497-ad55-18768ff36da1","Type":"ContainerStarted","Data":"88b3667e7b6653feebbf2bfbfb25bb84188b46a2a9afdaf15041bcccc3527730"} Feb 18 06:09:50 crc kubenswrapper[4707]: I0218 06:09:50.504313 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 18 06:09:51 crc kubenswrapper[4707]: I0218 06:09:51.186743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff7a9ac0-e9b0-4497-ad55-18768ff36da1","Type":"ContainerStarted","Data":"00cb6d1d122499e285cf338332cc22d50084850612750b1ce585bbffab672503"} 
Feb 18 06:09:51 crc kubenswrapper[4707]: I0218 06:09:51.186785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff7a9ac0-e9b0-4497-ad55-18768ff36da1","Type":"ContainerStarted","Data":"890e684ffd3f47f476df14989b9ed7649d65b1abbfc1ee0a17746f311c524c7d"} Feb 18 06:09:51 crc kubenswrapper[4707]: I0218 06:09:51.210618 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.210602274 podStartE2EDuration="2.210602274s" podCreationTimestamp="2026-02-18 06:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:09:51.208109187 +0000 UTC m=+1327.856068411" watchObservedRunningTime="2026-02-18 06:09:51.210602274 +0000 UTC m=+1327.858561408" Feb 18 06:09:52 crc kubenswrapper[4707]: I0218 06:09:52.908717 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:09:52 crc kubenswrapper[4707]: I0218 06:09:52.909086 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 06:09:55 crc kubenswrapper[4707]: I0218 06:09:55.504773 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 18 06:09:55 crc kubenswrapper[4707]: I0218 06:09:55.536117 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 18 06:09:56 crc kubenswrapper[4707]: I0218 06:09:56.258029 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 06:09:57 crc kubenswrapper[4707]: I0218 06:09:57.909038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 06:09:57 crc kubenswrapper[4707]: I0218 06:09:57.909360 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Feb 18 06:09:58 crc kubenswrapper[4707]: I0218 06:09:58.923924 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ca2ba934-fce0-4bc1-af4e-27d758f7aef6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:09:58 crc kubenswrapper[4707]: I0218 06:09:58.923922 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ca2ba934-fce0-4bc1-af4e-27d758f7aef6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:09:59 crc kubenswrapper[4707]: I0218 06:09:59.571979 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:09:59 crc kubenswrapper[4707]: I0218 06:09:59.572206 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 06:10:00 crc kubenswrapper[4707]: I0218 06:10:00.583940 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff7a9ac0-e9b0-4497-ad55-18768ff36da1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:10:00 crc kubenswrapper[4707]: I0218 06:10:00.584031 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff7a9ac0-e9b0-4497-ad55-18768ff36da1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 06:10:02 crc kubenswrapper[4707]: I0218 06:10:02.237165 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Feb 18 06:10:07 crc kubenswrapper[4707]: I0218 06:10:07.914480 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 06:10:07 crc kubenswrapper[4707]: I0218 06:10:07.921072 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 06:10:07 crc kubenswrapper[4707]: I0218 06:10:07.923391 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 06:10:08 crc kubenswrapper[4707]: I0218 06:10:08.361768 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 06:10:09 crc kubenswrapper[4707]: I0218 06:10:09.578289 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 06:10:09 crc kubenswrapper[4707]: I0218 06:10:09.579385 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 06:10:09 crc kubenswrapper[4707]: I0218 06:10:09.584678 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 06:10:09 crc kubenswrapper[4707]: I0218 06:10:09.591615 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 06:10:10 crc kubenswrapper[4707]: I0218 06:10:10.374254 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 06:10:10 crc kubenswrapper[4707]: I0218 06:10:10.383835 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 06:10:17 crc kubenswrapper[4707]: I0218 06:10:17.753955 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:10:18 crc kubenswrapper[4707]: I0218 06:10:18.909456 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 
06:10:21 crc kubenswrapper[4707]: I0218 06:10:21.563628 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" containerName="rabbitmq" containerID="cri-o://fd972a2d3d4bd78031c578208b45afb2e0d2ec70227d127c35345986c0d1abd7" gracePeriod=604797 Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.018373 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5f5jr"] Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.021454 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.031460 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5f5jr"] Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.213401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-catalog-content\") pod \"redhat-operators-5f5jr\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.213850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj9qg\" (UniqueName: \"kubernetes.io/projected/14f1fa8e-a809-4a16-a85d-338d18628a8a-kube-api-access-qj9qg\") pod \"redhat-operators-5f5jr\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.213934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-utilities\") pod \"redhat-operators-5f5jr\" 
(UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.318089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-catalog-content\") pod \"redhat-operators-5f5jr\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.318216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj9qg\" (UniqueName: \"kubernetes.io/projected/14f1fa8e-a809-4a16-a85d-338d18628a8a-kube-api-access-qj9qg\") pod \"redhat-operators-5f5jr\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.318247 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-utilities\") pod \"redhat-operators-5f5jr\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.318731 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-catalog-content\") pod \"redhat-operators-5f5jr\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.318767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-utilities\") pod \"redhat-operators-5f5jr\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " 
pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.341682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj9qg\" (UniqueName: \"kubernetes.io/projected/14f1fa8e-a809-4a16-a85d-338d18628a8a-kube-api-access-qj9qg\") pod \"redhat-operators-5f5jr\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.356388 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:22 crc kubenswrapper[4707]: I0218 06:10:22.873584 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5f5jr"] Feb 18 06:10:23 crc kubenswrapper[4707]: I0218 06:10:23.520303 4707 generic.go:334] "Generic (PLEG): container finished" podID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerID="8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d" exitCode=0 Feb 18 06:10:23 crc kubenswrapper[4707]: I0218 06:10:23.520577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f5jr" event={"ID":"14f1fa8e-a809-4a16-a85d-338d18628a8a","Type":"ContainerDied","Data":"8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d"} Feb 18 06:10:23 crc kubenswrapper[4707]: I0218 06:10:23.520635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f5jr" event={"ID":"14f1fa8e-a809-4a16-a85d-338d18628a8a","Type":"ContainerStarted","Data":"da128c62e70eebeb4d053567db0c640238924015067f8e50cf8fa8b91cf365d8"} Feb 18 06:10:23 crc kubenswrapper[4707]: I0218 06:10:23.587859 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" containerName="rabbitmq" 
containerID="cri-o://2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf" gracePeriod=604796 Feb 18 06:10:24 crc kubenswrapper[4707]: I0218 06:10:24.079072 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Feb 18 06:10:24 crc kubenswrapper[4707]: I0218 06:10:24.467096 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Feb 18 06:10:24 crc kubenswrapper[4707]: I0218 06:10:24.530023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f5jr" event={"ID":"14f1fa8e-a809-4a16-a85d-338d18628a8a","Type":"ContainerStarted","Data":"f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411"} Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.792323 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t8wwk"] Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.794628 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.806564 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t8wwk"] Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.893072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-catalog-content\") pod \"community-operators-t8wwk\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.893249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-utilities\") pod \"community-operators-t8wwk\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.893329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsndj\" (UniqueName: \"kubernetes.io/projected/7878b968-69b2-47a7-ba2c-de8d793eb24a-kube-api-access-dsndj\") pod \"community-operators-t8wwk\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.995476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-utilities\") pod \"community-operators-t8wwk\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.995623 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dsndj\" (UniqueName: \"kubernetes.io/projected/7878b968-69b2-47a7-ba2c-de8d793eb24a-kube-api-access-dsndj\") pod \"community-operators-t8wwk\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.995782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-catalog-content\") pod \"community-operators-t8wwk\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.996151 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-utilities\") pod \"community-operators-t8wwk\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:25 crc kubenswrapper[4707]: I0218 06:10:25.996249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-catalog-content\") pod \"community-operators-t8wwk\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:26 crc kubenswrapper[4707]: I0218 06:10:26.019605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsndj\" (UniqueName: \"kubernetes.io/projected/7878b968-69b2-47a7-ba2c-de8d793eb24a-kube-api-access-dsndj\") pod \"community-operators-t8wwk\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:26 crc kubenswrapper[4707]: I0218 06:10:26.134974 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:26 crc kubenswrapper[4707]: I0218 06:10:26.716562 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t8wwk"] Feb 18 06:10:26 crc kubenswrapper[4707]: W0218 06:10:26.753488 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7878b968_69b2_47a7_ba2c_de8d793eb24a.slice/crio-7b74943c62e5e7264cf2a049255c5d38575fe675cb81936ad1404406a5fc3192 WatchSource:0}: Error finding container 7b74943c62e5e7264cf2a049255c5d38575fe675cb81936ad1404406a5fc3192: Status 404 returned error can't find the container with id 7b74943c62e5e7264cf2a049255c5d38575fe675cb81936ad1404406a5fc3192 Feb 18 06:10:27 crc kubenswrapper[4707]: I0218 06:10:27.576334 4707 generic.go:334] "Generic (PLEG): container finished" podID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerID="f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411" exitCode=0 Feb 18 06:10:27 crc kubenswrapper[4707]: I0218 06:10:27.576390 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f5jr" event={"ID":"14f1fa8e-a809-4a16-a85d-338d18628a8a","Type":"ContainerDied","Data":"f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411"} Feb 18 06:10:27 crc kubenswrapper[4707]: I0218 06:10:27.579546 4707 generic.go:334] "Generic (PLEG): container finished" podID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerID="54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6" exitCode=0 Feb 18 06:10:27 crc kubenswrapper[4707]: I0218 06:10:27.579636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8wwk" event={"ID":"7878b968-69b2-47a7-ba2c-de8d793eb24a","Type":"ContainerDied","Data":"54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6"} Feb 18 06:10:27 crc kubenswrapper[4707]: I0218 
06:10:27.579703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8wwk" event={"ID":"7878b968-69b2-47a7-ba2c-de8d793eb24a","Type":"ContainerStarted","Data":"7b74943c62e5e7264cf2a049255c5d38575fe675cb81936ad1404406a5fc3192"} Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.609425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8wwk" event={"ID":"7878b968-69b2-47a7-ba2c-de8d793eb24a","Type":"ContainerStarted","Data":"76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f"} Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.615543 4707 generic.go:334] "Generic (PLEG): container finished" podID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" containerID="fd972a2d3d4bd78031c578208b45afb2e0d2ec70227d127c35345986c0d1abd7" exitCode=0 Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.615603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"298a4b48-6611-4cb4-8ccf-e9a00c23622b","Type":"ContainerDied","Data":"fd972a2d3d4bd78031c578208b45afb2e0d2ec70227d127c35345986c0d1abd7"} Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.617912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f5jr" event={"ID":"14f1fa8e-a809-4a16-a85d-338d18628a8a","Type":"ContainerStarted","Data":"9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1"} Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.656763 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5f5jr" podStartSLOduration=3.202440275 podStartE2EDuration="7.656742348s" podCreationTimestamp="2026-02-18 06:10:21 +0000 UTC" firstStartedPulling="2026-02-18 06:10:23.522386275 +0000 UTC m=+1360.170345409" lastFinishedPulling="2026-02-18 06:10:27.976688348 +0000 UTC m=+1364.624647482" observedRunningTime="2026-02-18 06:10:28.652220806 
+0000 UTC m=+1365.300179940" watchObservedRunningTime="2026-02-18 06:10:28.656742348 +0000 UTC m=+1365.304701482" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.838579 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.889498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/298a4b48-6611-4cb4-8ccf-e9a00c23622b-pod-info\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.889842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-config-data\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.889955 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-tls\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.890030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-plugins\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.894561 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" 
(UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.900180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-confd\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.900339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/298a4b48-6611-4cb4-8ccf-e9a00c23622b-erlang-cookie-secret\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.900486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-server-conf\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.900563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-erlang-cookie\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.900680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5zg6\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-kube-api-access-c5zg6\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 
06:10:28.900755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.900909 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-plugins-conf\") pod \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\" (UID: \"298a4b48-6611-4cb4-8ccf-e9a00c23622b\") " Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.901923 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.903062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.903839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.903933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.904015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/298a4b48-6611-4cb4-8ccf-e9a00c23622b-pod-info" (OuterVolumeSpecName: "pod-info") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.911780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/298a4b48-6611-4cb4-8ccf-e9a00c23622b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.920223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.931613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-config-data" (OuterVolumeSpecName: "config-data") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.945675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-kube-api-access-c5zg6" (OuterVolumeSpecName: "kube-api-access-c5zg6") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "kube-api-access-c5zg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:28 crc kubenswrapper[4707]: I0218 06:10:28.999721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-server-conf" (OuterVolumeSpecName: "server-conf") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.003782 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/298a4b48-6611-4cb4-8ccf-e9a00c23622b-pod-info\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.003837 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.003847 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.003857 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/298a4b48-6611-4cb4-8ccf-e9a00c23622b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.003868 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-server-conf\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.003877 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.003885 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5zg6\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-kube-api-access-c5zg6\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 
06:10:29.003917 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.003927 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/298a4b48-6611-4cb4-8ccf-e9a00c23622b-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.037869 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.101680 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "298a4b48-6611-4cb4-8ccf-e9a00c23622b" (UID: "298a4b48-6611-4cb4-8ccf-e9a00c23622b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.106630 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/298a4b48-6611-4cb4-8ccf-e9a00c23622b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.106677 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.658972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"298a4b48-6611-4cb4-8ccf-e9a00c23622b","Type":"ContainerDied","Data":"6f0f234a7ab4ef25717bc7a190e41071305ad93bd80646661d1136fb5827d3d7"} Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.659729 4707 scope.go:117] "RemoveContainer" containerID="fd972a2d3d4bd78031c578208b45afb2e0d2ec70227d127c35345986c0d1abd7" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.661320 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.667427 4707 generic.go:334] "Generic (PLEG): container finished" podID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerID="76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f" exitCode=0 Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.667639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8wwk" event={"ID":"7878b968-69b2-47a7-ba2c-de8d793eb24a","Type":"ContainerDied","Data":"76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f"} Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.745486 4707 scope.go:117] "RemoveContainer" containerID="cd5038c856e6b32cf4df33653ad2d71f042ff62ccb85562b6962da27b37269dd" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.749740 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.762508 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.773926 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:10:29 crc kubenswrapper[4707]: E0218 06:10:29.774360 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" containerName="setup-container" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.774381 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" containerName="setup-container" Feb 18 06:10:29 crc kubenswrapper[4707]: E0218 06:10:29.774412 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" containerName="rabbitmq" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.774418 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" containerName="rabbitmq" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.774600 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" containerName="rabbitmq" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.775902 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.778913 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.779095 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.779415 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.779889 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.780099 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.780966 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.782181 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kc457" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.820247 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.828099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.828356 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.828597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b14ae66-3d41-476b-9ca7-2490e36de0aa-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.829075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b14ae66-3d41-476b-9ca7-2490e36de0aa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.829098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b14ae66-3d41-476b-9ca7-2490e36de0aa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.829491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b14ae66-3d41-476b-9ca7-2490e36de0aa-erlang-cookie-secret\") 
pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.829571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b14ae66-3d41-476b-9ca7-2490e36de0aa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.829603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.829660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2qb5\" (UniqueName: \"kubernetes.io/projected/7b14ae66-3d41-476b-9ca7-2490e36de0aa-kube-api-access-r2qb5\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.829691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.829752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.932650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b14ae66-3d41-476b-9ca7-2490e36de0aa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.932714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b14ae66-3d41-476b-9ca7-2490e36de0aa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.932777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b14ae66-3d41-476b-9ca7-2490e36de0aa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.932830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b14ae66-3d41-476b-9ca7-2490e36de0aa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.932855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.932893 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r2qb5\" (UniqueName: \"kubernetes.io/projected/7b14ae66-3d41-476b-9ca7-2490e36de0aa-kube-api-access-r2qb5\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.932916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.932956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.932986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.933011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.933047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7b14ae66-3d41-476b-9ca7-2490e36de0aa-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.934195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b14ae66-3d41-476b-9ca7-2490e36de0aa-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.934291 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.935573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b14ae66-3d41-476b-9ca7-2490e36de0aa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.938930 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.956335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " 
pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.959294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b14ae66-3d41-476b-9ca7-2490e36de0aa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.959406 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.959653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b14ae66-3d41-476b-9ca7-2490e36de0aa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.960325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b14ae66-3d41-476b-9ca7-2490e36de0aa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.965474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b14ae66-3d41-476b-9ca7-2490e36de0aa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:29 crc kubenswrapper[4707]: I0218 06:10:29.971630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2qb5\" (UniqueName: 
\"kubernetes.io/projected/7b14ae66-3d41-476b-9ca7-2490e36de0aa-kube-api-access-r2qb5\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.029898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"7b14ae66-3d41-476b-9ca7-2490e36de0aa\") " pod="openstack/rabbitmq-server-0" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.107770 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="298a4b48-6611-4cb4-8ccf-e9a00c23622b" path="/var/lib/kubelet/pods/298a4b48-6611-4cb4-8ccf-e9a00c23622b/volumes" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.215706 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.409906 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.462647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/536dddc2-0691-4171-98b1-1462ddf6b38a-erlang-cookie-secret\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.462693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-plugins\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.462864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-erlang-cookie\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.462893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-config-data\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.462914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwx2c\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-kube-api-access-gwx2c\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.462942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/536dddc2-0691-4171-98b1-1462ddf6b38a-pod-info\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.463020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.463109 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-server-conf\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.463133 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-confd\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.463169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-tls\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.463205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-plugins-conf\") pod \"536dddc2-0691-4171-98b1-1462ddf6b38a\" (UID: \"536dddc2-0691-4171-98b1-1462ddf6b38a\") " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 
06:10:30.466537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.466613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.467198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.488498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536dddc2-0691-4171-98b1-1462ddf6b38a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.493922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.495448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-kube-api-access-gwx2c" (OuterVolumeSpecName: "kube-api-access-gwx2c") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "kube-api-access-gwx2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.497390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.502966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/536dddc2-0691-4171-98b1-1462ddf6b38a-pod-info" (OuterVolumeSpecName: "pod-info") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.574928 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.575372 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.575381 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/536dddc2-0691-4171-98b1-1462ddf6b38a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.575391 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.575399 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.575410 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwx2c\" (UniqueName: \"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-kube-api-access-gwx2c\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.575418 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/536dddc2-0691-4171-98b1-1462ddf6b38a-pod-info\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc 
kubenswrapper[4707]: I0218 06:10:30.575439 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.582028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-config-data" (OuterVolumeSpecName: "config-data") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.632753 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.645125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-server-conf" (OuterVolumeSpecName: "server-conf") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.679351 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-server-conf\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.685913 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/536dddc2-0691-4171-98b1-1462ddf6b38a-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.685953 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.697598 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8wwk" event={"ID":"7878b968-69b2-47a7-ba2c-de8d793eb24a","Type":"ContainerStarted","Data":"99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919"} Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.702977 4707 generic.go:334] "Generic (PLEG): container finished" podID="536dddc2-0691-4171-98b1-1462ddf6b38a" containerID="2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf" exitCode=0 Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.703039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"536dddc2-0691-4171-98b1-1462ddf6b38a","Type":"ContainerDied","Data":"2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf"} Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.703072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"536dddc2-0691-4171-98b1-1462ddf6b38a","Type":"ContainerDied","Data":"af06aeec8837a106294b84c5c2d4d59123a901928ae74a2dabda32ca4d8247e1"} Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.703092 4707 scope.go:117] "RemoveContainer" containerID="2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.703242 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.722818 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "536dddc2-0691-4171-98b1-1462ddf6b38a" (UID: "536dddc2-0691-4171-98b1-1462ddf6b38a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.724224 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t8wwk" podStartSLOduration=2.931845901 podStartE2EDuration="5.724206008s" podCreationTimestamp="2026-02-18 06:10:25 +0000 UTC" firstStartedPulling="2026-02-18 06:10:27.580932619 +0000 UTC m=+1364.228891763" lastFinishedPulling="2026-02-18 06:10:30.373292736 +0000 UTC m=+1367.021251870" observedRunningTime="2026-02-18 06:10:30.7187008 +0000 UTC m=+1367.366659934" watchObservedRunningTime="2026-02-18 06:10:30.724206008 +0000 UTC m=+1367.372165142" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.766484 4707 scope.go:117] "RemoveContainer" containerID="c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.787989 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/536dddc2-0691-4171-98b1-1462ddf6b38a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.805177 4707 scope.go:117] "RemoveContainer" containerID="2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf" Feb 18 06:10:30 crc kubenswrapper[4707]: E0218 06:10:30.811837 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf\": container with ID starting with 2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf not found: ID does not exist" containerID="2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.812044 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf"} err="failed to get container status \"2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf\": rpc error: code = NotFound desc = could not find container \"2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf\": container with ID starting with 2a6163097a7118165dc46d6c6190999af6848a9aa49d9564784b80802cc17cbf not found: ID does not exist" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.812089 4707 scope.go:117] "RemoveContainer" containerID="c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b" Feb 18 06:10:30 crc kubenswrapper[4707]: E0218 06:10:30.813844 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b\": container with ID starting with c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b not found: ID does not exist" containerID="c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b" Feb 
18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.813893 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b"} err="failed to get container status \"c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b\": rpc error: code = NotFound desc = could not find container \"c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b\": container with ID starting with c01a6047cda14066633b37aa8700b75d86a8103b3da940ffcb915ca5a290564b not found: ID does not exist" Feb 18 06:10:30 crc kubenswrapper[4707]: I0218 06:10:30.862336 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.053887 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.076352 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.091574 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:10:31 crc kubenswrapper[4707]: E0218 06:10:31.092230 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" containerName="setup-container" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.092253 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" containerName="setup-container" Feb 18 06:10:31 crc kubenswrapper[4707]: E0218 06:10:31.092269 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" containerName="rabbitmq" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.092276 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" containerName="rabbitmq" Feb 
18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.092525 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" containerName="rabbitmq" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.093879 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.097357 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.097551 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.097820 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.098721 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.101505 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.101877 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-j86kp" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.102020 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.114423 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.197304 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.197369 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.197411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.197698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.197870 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.197929 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.198169 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.198276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.198432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.198474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9qsz\" (UniqueName: \"kubernetes.io/projected/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-kube-api-access-v9qsz\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.198634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.300426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.300506 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.300539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.300572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.300612 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.300657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.300678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.300755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.300987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.301128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 
06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.301154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9qsz\" (UniqueName: \"kubernetes.io/projected/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-kube-api-access-v9qsz\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.301221 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.301865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.302256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.308753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.309238 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.309537 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.310622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.313164 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.313629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.315670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.333247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9qsz\" (UniqueName: \"kubernetes.io/projected/7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d-kube-api-access-v9qsz\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.338487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.431366 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:10:31 crc kubenswrapper[4707]: I0218 06:10:31.748485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b14ae66-3d41-476b-9ca7-2490e36de0aa","Type":"ContainerStarted","Data":"fc149eb6c19866ab0404c9708c2bb45746d3af36b437b177cd4ee46db69a4e06"} Feb 18 06:10:32 crc kubenswrapper[4707]: I0218 06:10:32.003861 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 06:10:32 crc kubenswrapper[4707]: I0218 06:10:32.108757 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536dddc2-0691-4171-98b1-1462ddf6b38a" path="/var/lib/kubelet/pods/536dddc2-0691-4171-98b1-1462ddf6b38a/volumes" Feb 18 06:10:32 crc kubenswrapper[4707]: I0218 06:10:32.356777 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:32 crc kubenswrapper[4707]: I0218 06:10:32.356886 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:32 crc kubenswrapper[4707]: I0218 06:10:32.759266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b14ae66-3d41-476b-9ca7-2490e36de0aa","Type":"ContainerStarted","Data":"c7996143ceb46bed69f4aaac5e446ccdb54ba63fc4e84248afcd8dc8a8506f5e"} Feb 18 06:10:32 crc kubenswrapper[4707]: I0218 06:10:32.764333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d","Type":"ContainerStarted","Data":"8e1fc6c1d955c087155f167e286949710d3e1f0702453afa981b7f9923f34dc2"} Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.495746 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5f5jr" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" 
containerName="registry-server" probeResult="failure" output=< Feb 18 06:10:33 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Feb 18 06:10:33 crc kubenswrapper[4707]: > Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.499366 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f8b8c57f5-jctj2"] Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.501378 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.504015 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.526195 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f8b8c57f5-jctj2"] Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.586586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-svc\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.586673 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-sb\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.586703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-nb\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: 
\"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.586796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-config\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.586883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m7cq\" (UniqueName: \"kubernetes.io/projected/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-kube-api-access-9m7cq\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.586971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-swift-storage-0\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.587029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-openstack-edpm-ipam\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.689077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-openstack-edpm-ipam\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.689142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-svc\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.689165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-sb\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.689186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-nb\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.689250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-config\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.689291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m7cq\" (UniqueName: 
\"kubernetes.io/projected/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-kube-api-access-9m7cq\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.689351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-swift-storage-0\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.690278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-sb\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.691052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-nb\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.691064 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-config\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.691148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-swift-storage-0\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.691576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-svc\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.692030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-openstack-edpm-ipam\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.713572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m7cq\" (UniqueName: \"kubernetes.io/projected/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-kube-api-access-9m7cq\") pod \"dnsmasq-dns-5f8b8c57f5-jctj2\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.774840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d","Type":"ContainerStarted","Data":"abb1fd7dd664287eb18f4110d533faeb114ff5abaed57e8afe94819e899a4e4f"} Feb 18 06:10:33 crc kubenswrapper[4707]: I0218 06:10:33.825846 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:34 crc kubenswrapper[4707]: W0218 06:10:34.316688 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6b5866e_7a21_47e4_be2d_7e5ce95b4f26.slice/crio-b7fc44e2c6c5839254cc6f32f90f5c353dcadf4ae304bbdcd053bb8c5ca8563a WatchSource:0}: Error finding container b7fc44e2c6c5839254cc6f32f90f5c353dcadf4ae304bbdcd053bb8c5ca8563a: Status 404 returned error can't find the container with id b7fc44e2c6c5839254cc6f32f90f5c353dcadf4ae304bbdcd053bb8c5ca8563a Feb 18 06:10:34 crc kubenswrapper[4707]: I0218 06:10:34.321129 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f8b8c57f5-jctj2"] Feb 18 06:10:34 crc kubenswrapper[4707]: I0218 06:10:34.792211 4707 generic.go:334] "Generic (PLEG): container finished" podID="a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" containerID="b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6" exitCode=0 Feb 18 06:10:34 crc kubenswrapper[4707]: I0218 06:10:34.792355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" event={"ID":"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26","Type":"ContainerDied","Data":"b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6"} Feb 18 06:10:34 crc kubenswrapper[4707]: I0218 06:10:34.792751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" event={"ID":"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26","Type":"ContainerStarted","Data":"b7fc44e2c6c5839254cc6f32f90f5c353dcadf4ae304bbdcd053bb8c5ca8563a"} Feb 18 06:10:35 crc kubenswrapper[4707]: I0218 06:10:35.806942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" event={"ID":"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26","Type":"ContainerStarted","Data":"71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109"} Feb 18 06:10:35 crc 
kubenswrapper[4707]: I0218 06:10:35.807761 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:35 crc kubenswrapper[4707]: I0218 06:10:35.838719 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" podStartSLOduration=2.838685435 podStartE2EDuration="2.838685435s" podCreationTimestamp="2026-02-18 06:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:10:35.828219303 +0000 UTC m=+1372.476178447" watchObservedRunningTime="2026-02-18 06:10:35.838685435 +0000 UTC m=+1372.486644569" Feb 18 06:10:36 crc kubenswrapper[4707]: I0218 06:10:36.135909 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:36 crc kubenswrapper[4707]: I0218 06:10:36.135967 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:36 crc kubenswrapper[4707]: I0218 06:10:36.192233 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:36 crc kubenswrapper[4707]: I0218 06:10:36.864059 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:36 crc kubenswrapper[4707]: I0218 06:10:36.922217 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t8wwk"] Feb 18 06:10:38 crc kubenswrapper[4707]: I0218 06:10:38.840699 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t8wwk" podUID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerName="registry-server" 
containerID="cri-o://99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919" gracePeriod=2 Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.341131 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.439897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-catalog-content\") pod \"7878b968-69b2-47a7-ba2c-de8d793eb24a\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.439993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-utilities\") pod \"7878b968-69b2-47a7-ba2c-de8d793eb24a\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.440069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsndj\" (UniqueName: \"kubernetes.io/projected/7878b968-69b2-47a7-ba2c-de8d793eb24a-kube-api-access-dsndj\") pod \"7878b968-69b2-47a7-ba2c-de8d793eb24a\" (UID: \"7878b968-69b2-47a7-ba2c-de8d793eb24a\") " Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.441860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-utilities" (OuterVolumeSpecName: "utilities") pod "7878b968-69b2-47a7-ba2c-de8d793eb24a" (UID: "7878b968-69b2-47a7-ba2c-de8d793eb24a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.445889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7878b968-69b2-47a7-ba2c-de8d793eb24a-kube-api-access-dsndj" (OuterVolumeSpecName: "kube-api-access-dsndj") pod "7878b968-69b2-47a7-ba2c-de8d793eb24a" (UID: "7878b968-69b2-47a7-ba2c-de8d793eb24a"). InnerVolumeSpecName "kube-api-access-dsndj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.504464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7878b968-69b2-47a7-ba2c-de8d793eb24a" (UID: "7878b968-69b2-47a7-ba2c-de8d793eb24a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.544658 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.549049 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7878b968-69b2-47a7-ba2c-de8d793eb24a-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.549082 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsndj\" (UniqueName: \"kubernetes.io/projected/7878b968-69b2-47a7-ba2c-de8d793eb24a-kube-api-access-dsndj\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.855632 4707 generic.go:334] "Generic (PLEG): container finished" podID="7878b968-69b2-47a7-ba2c-de8d793eb24a" 
containerID="99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919" exitCode=0 Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.855677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8wwk" event={"ID":"7878b968-69b2-47a7-ba2c-de8d793eb24a","Type":"ContainerDied","Data":"99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919"} Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.855706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8wwk" event={"ID":"7878b968-69b2-47a7-ba2c-de8d793eb24a","Type":"ContainerDied","Data":"7b74943c62e5e7264cf2a049255c5d38575fe675cb81936ad1404406a5fc3192"} Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.855725 4707 scope.go:117] "RemoveContainer" containerID="99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.855860 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t8wwk" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.889332 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t8wwk"] Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.892949 4707 scope.go:117] "RemoveContainer" containerID="76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.899513 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t8wwk"] Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.919108 4707 scope.go:117] "RemoveContainer" containerID="54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.970675 4707 scope.go:117] "RemoveContainer" containerID="99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919" Feb 18 06:10:39 crc kubenswrapper[4707]: E0218 06:10:39.971376 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919\": container with ID starting with 99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919 not found: ID does not exist" containerID="99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.971429 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919"} err="failed to get container status \"99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919\": rpc error: code = NotFound desc = could not find container \"99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919\": container with ID starting with 99d9aab4ebc1bea53d278670a55bd323f8d0552930f437f74702f8ae51021919 not 
found: ID does not exist" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.971459 4707 scope.go:117] "RemoveContainer" containerID="76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f" Feb 18 06:10:39 crc kubenswrapper[4707]: E0218 06:10:39.971934 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f\": container with ID starting with 76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f not found: ID does not exist" containerID="76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.971992 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f"} err="failed to get container status \"76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f\": rpc error: code = NotFound desc = could not find container \"76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f\": container with ID starting with 76fdf875958eba4e95996ef13b3c33db3dbc00944a6b79dbe7aa47ed75f1a14f not found: ID does not exist" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.972035 4707 scope.go:117] "RemoveContainer" containerID="54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6" Feb 18 06:10:39 crc kubenswrapper[4707]: E0218 06:10:39.972306 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6\": container with ID starting with 54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6 not found: ID does not exist" containerID="54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6" Feb 18 06:10:39 crc kubenswrapper[4707]: I0218 06:10:39.972339 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6"} err="failed to get container status \"54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6\": rpc error: code = NotFound desc = could not find container \"54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6\": container with ID starting with 54bf384c88df3cf955f1ac49b8d18537565941e238116a9c0a4995d26b1b0fe6 not found: ID does not exist" Feb 18 06:10:40 crc kubenswrapper[4707]: I0218 06:10:40.065263 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7878b968-69b2-47a7-ba2c-de8d793eb24a" path="/var/lib/kubelet/pods/7878b968-69b2-47a7-ba2c-de8d793eb24a/volumes" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.406309 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.462173 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ffpsr"] Feb 18 06:10:42 crc kubenswrapper[4707]: E0218 06:10:42.462713 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerName="extract-content" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.462733 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerName="extract-content" Feb 18 06:10:42 crc kubenswrapper[4707]: E0218 06:10:42.462762 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerName="registry-server" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.462771 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerName="registry-server" Feb 18 06:10:42 crc kubenswrapper[4707]: E0218 06:10:42.462811 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerName="extract-utilities" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.462818 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerName="extract-utilities" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.463025 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7878b968-69b2-47a7-ba2c-de8d793eb24a" containerName="registry-server" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.464373 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.471275 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.480755 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ffpsr"] Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.511461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-utilities\") pod \"certified-operators-ffpsr\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.511611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-catalog-content\") pod \"certified-operators-ffpsr\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.511745 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxktw\" (UniqueName: \"kubernetes.io/projected/42cbeba2-2af1-46f0-a401-22a6b8af0913-kube-api-access-qxktw\") pod \"certified-operators-ffpsr\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.613402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-catalog-content\") pod \"certified-operators-ffpsr\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.613550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxktw\" (UniqueName: \"kubernetes.io/projected/42cbeba2-2af1-46f0-a401-22a6b8af0913-kube-api-access-qxktw\") pod \"certified-operators-ffpsr\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.613594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-utilities\") pod \"certified-operators-ffpsr\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.614296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-catalog-content\") pod \"certified-operators-ffpsr\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 
06:10:42.614314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-utilities\") pod \"certified-operators-ffpsr\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.633849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxktw\" (UniqueName: \"kubernetes.io/projected/42cbeba2-2af1-46f0-a401-22a6b8af0913-kube-api-access-qxktw\") pod \"certified-operators-ffpsr\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:42 crc kubenswrapper[4707]: I0218 06:10:42.784207 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:43 crc kubenswrapper[4707]: I0218 06:10:43.313444 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ffpsr"] Feb 18 06:10:43 crc kubenswrapper[4707]: I0218 06:10:43.429939 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5f5jr"] Feb 18 06:10:43 crc kubenswrapper[4707]: I0218 06:10:43.827675 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:43 crc kubenswrapper[4707]: I0218 06:10:43.881456 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9d8756bc-mqn6s"] Feb 18 06:10:43 crc kubenswrapper[4707]: I0218 06:10:43.881707 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" podUID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" containerName="dnsmasq-dns" containerID="cri-o://6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107" gracePeriod=10 Feb 18 06:10:43 crc 
kubenswrapper[4707]: I0218 06:10:43.905371 4707 generic.go:334] "Generic (PLEG): container finished" podID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerID="3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a" exitCode=0 Feb 18 06:10:43 crc kubenswrapper[4707]: I0218 06:10:43.905589 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5f5jr" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerName="registry-server" containerID="cri-o://9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1" gracePeriod=2 Feb 18 06:10:43 crc kubenswrapper[4707]: I0218 06:10:43.906836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffpsr" event={"ID":"42cbeba2-2af1-46f0-a401-22a6b8af0913","Type":"ContainerDied","Data":"3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a"} Feb 18 06:10:43 crc kubenswrapper[4707]: I0218 06:10:43.906871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffpsr" event={"ID":"42cbeba2-2af1-46f0-a401-22a6b8af0913","Type":"ContainerStarted","Data":"cc48cadf50cd117cab58a8d7c7114023a55389c716bc8804751f0f7f1ec3077d"} Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.103456 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d474c7589-z56p2"] Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.105320 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.131105 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d474c7589-z56p2"] Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.213324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.213689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-ovsdbserver-sb\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.213722 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-ovsdbserver-nb\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.213745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-dns-swift-storage-0\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.213910 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhjgx\" (UniqueName: \"kubernetes.io/projected/0341405c-1d6a-4750-b7c5-07ae9825d4b6-kube-api-access-lhjgx\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.214075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-dns-svc\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.214109 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-config\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.316543 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-ovsdbserver-sb\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.316588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-ovsdbserver-nb\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.316607 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-dns-swift-storage-0\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.316636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhjgx\" (UniqueName: \"kubernetes.io/projected/0341405c-1d6a-4750-b7c5-07ae9825d4b6-kube-api-access-lhjgx\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.316691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-dns-svc\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.316715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-config\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.316773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.317551 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.318089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-ovsdbserver-sb\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.318569 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-ovsdbserver-nb\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.321655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-dns-swift-storage-0\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.322527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-dns-svc\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.323054 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0341405c-1d6a-4750-b7c5-07ae9825d4b6-config\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.356973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhjgx\" (UniqueName: \"kubernetes.io/projected/0341405c-1d6a-4750-b7c5-07ae9825d4b6-kube-api-access-lhjgx\") pod \"dnsmasq-dns-6d474c7589-z56p2\" (UID: \"0341405c-1d6a-4750-b7c5-07ae9825d4b6\") " pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.437050 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.635044 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.670872 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.828994 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-nb\") pod \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.829320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-config\") pod \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.829411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-utilities\") pod \"14f1fa8e-a809-4a16-a85d-338d18628a8a\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.829471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-catalog-content\") pod \"14f1fa8e-a809-4a16-a85d-338d18628a8a\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.829497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-sb\") pod \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.829535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj9qg\" 
(UniqueName: \"kubernetes.io/projected/14f1fa8e-a809-4a16-a85d-338d18628a8a-kube-api-access-qj9qg\") pod \"14f1fa8e-a809-4a16-a85d-338d18628a8a\" (UID: \"14f1fa8e-a809-4a16-a85d-338d18628a8a\") " Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.829613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-swift-storage-0\") pod \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.829691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdpbr\" (UniqueName: \"kubernetes.io/projected/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-kube-api-access-qdpbr\") pod \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.829758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-svc\") pod \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\" (UID: \"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b\") " Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.849020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-utilities" (OuterVolumeSpecName: "utilities") pod "14f1fa8e-a809-4a16-a85d-338d18628a8a" (UID: "14f1fa8e-a809-4a16-a85d-338d18628a8a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.861556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-kube-api-access-qdpbr" (OuterVolumeSpecName: "kube-api-access-qdpbr") pod "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" (UID: "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b"). InnerVolumeSpecName "kube-api-access-qdpbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.873172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f1fa8e-a809-4a16-a85d-338d18628a8a-kube-api-access-qj9qg" (OuterVolumeSpecName: "kube-api-access-qj9qg") pod "14f1fa8e-a809-4a16-a85d-338d18628a8a" (UID: "14f1fa8e-a809-4a16-a85d-338d18628a8a"). InnerVolumeSpecName "kube-api-access-qj9qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.899216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" (UID: "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.908519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" (UID: "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.938584 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.938629 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdpbr\" (UniqueName: \"kubernetes.io/projected/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-kube-api-access-qdpbr\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.938641 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.938654 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.938664 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj9qg\" (UniqueName: \"kubernetes.io/projected/14f1fa8e-a809-4a16-a85d-338d18628a8a-kube-api-access-qj9qg\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.953885 4707 generic.go:334] "Generic (PLEG): container finished" podID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerID="9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1" exitCode=0 Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.954115 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5f5jr" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.954804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f5jr" event={"ID":"14f1fa8e-a809-4a16-a85d-338d18628a8a","Type":"ContainerDied","Data":"9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1"} Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.954861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f5jr" event={"ID":"14f1fa8e-a809-4a16-a85d-338d18628a8a","Type":"ContainerDied","Data":"da128c62e70eebeb4d053567db0c640238924015067f8e50cf8fa8b91cf365d8"} Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.954881 4707 scope.go:117] "RemoveContainer" containerID="9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.964282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffpsr" event={"ID":"42cbeba2-2af1-46f0-a401-22a6b8af0913","Type":"ContainerStarted","Data":"2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712"} Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.966176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-config" (OuterVolumeSpecName: "config") pod "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" (UID: "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.968878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" (UID: "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.969770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" (UID: "4f112a71-47e0-4a0d-ab3e-d047e77ecd6b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.971720 4707 generic.go:334] "Generic (PLEG): container finished" podID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" containerID="6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107" exitCode=0 Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.971898 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.971899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" event={"ID":"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b","Type":"ContainerDied","Data":"6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107"} Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.972994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" event={"ID":"4f112a71-47e0-4a0d-ab3e-d047e77ecd6b","Type":"ContainerDied","Data":"79bd3d0b96a0314fa0ecc1d675ff57b9e19b8b53fb59fc22aa092f1e50252b5f"} Feb 18 06:10:44 crc kubenswrapper[4707]: I0218 06:10:44.988601 4707 scope.go:117] "RemoveContainer" containerID="f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.018609 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9d8756bc-mqn6s"] Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.023397 4707 
scope.go:117] "RemoveContainer" containerID="8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.026576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14f1fa8e-a809-4a16-a85d-338d18628a8a" (UID: "14f1fa8e-a809-4a16-a85d-338d18628a8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.028364 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9d8756bc-mqn6s"] Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.041755 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.041788 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.041890 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14f1fa8e-a809-4a16-a85d-338d18628a8a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.041900 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.050632 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d474c7589-z56p2"] Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.303433 4707 
scope.go:117] "RemoveContainer" containerID="9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1" Feb 18 06:10:45 crc kubenswrapper[4707]: E0218 06:10:45.304804 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1\": container with ID starting with 9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1 not found: ID does not exist" containerID="9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.304867 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1"} err="failed to get container status \"9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1\": rpc error: code = NotFound desc = could not find container \"9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1\": container with ID starting with 9e68d7b1b98f60f4ba7295c436f96798562840dd9fcd88d8d00f848943fb27c1 not found: ID does not exist" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.304905 4707 scope.go:117] "RemoveContainer" containerID="f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411" Feb 18 06:10:45 crc kubenswrapper[4707]: E0218 06:10:45.305258 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411\": container with ID starting with f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411 not found: ID does not exist" containerID="f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.305284 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411"} err="failed to get container status \"f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411\": rpc error: code = NotFound desc = could not find container \"f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411\": container with ID starting with f967f1c969baec3be7c36da96303bfcee290391417bc85758532cfd4de152411 not found: ID does not exist" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.305338 4707 scope.go:117] "RemoveContainer" containerID="8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d" Feb 18 06:10:45 crc kubenswrapper[4707]: E0218 06:10:45.305635 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d\": container with ID starting with 8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d not found: ID does not exist" containerID="8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.305663 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d"} err="failed to get container status \"8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d\": rpc error: code = NotFound desc = could not find container \"8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d\": container with ID starting with 8085158f4564b48793be67a5dd0cf6e6c89ab2ba397e2556520f8dea73e3e57d not found: ID does not exist" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.305705 4707 scope.go:117] "RemoveContainer" containerID="6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.353237 4707 scope.go:117] "RemoveContainer" 
containerID="e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.382894 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5f5jr"] Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.391339 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5f5jr"] Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.392984 4707 scope.go:117] "RemoveContainer" containerID="6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107" Feb 18 06:10:45 crc kubenswrapper[4707]: E0218 06:10:45.393517 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107\": container with ID starting with 6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107 not found: ID does not exist" containerID="6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.393559 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107"} err="failed to get container status \"6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107\": rpc error: code = NotFound desc = could not find container \"6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107\": container with ID starting with 6d21ef636d9d335ab83fb398c7faabe6a6f831d593a85c10fc9faf952a55d107 not found: ID does not exist" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.393609 4707 scope.go:117] "RemoveContainer" containerID="e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604" Feb 18 06:10:45 crc kubenswrapper[4707]: E0218 06:10:45.394179 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604\": container with ID starting with e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604 not found: ID does not exist" containerID="e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.394331 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604"} err="failed to get container status \"e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604\": rpc error: code = NotFound desc = could not find container \"e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604\": container with ID starting with e4ab68e022ae83bb0146519efbc4fc83be4dc07d21f97da14da6a7f99c47a604 not found: ID does not exist" Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.984958 4707 generic.go:334] "Generic (PLEG): container finished" podID="0341405c-1d6a-4750-b7c5-07ae9825d4b6" containerID="7584f62b8c0964a3d8fa69f33a5c9c6ade7c1f67a9e307219202cf5cda309f97" exitCode=0 Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.985106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d474c7589-z56p2" event={"ID":"0341405c-1d6a-4750-b7c5-07ae9825d4b6","Type":"ContainerDied","Data":"7584f62b8c0964a3d8fa69f33a5c9c6ade7c1f67a9e307219202cf5cda309f97"} Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.985175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d474c7589-z56p2" event={"ID":"0341405c-1d6a-4750-b7c5-07ae9825d4b6","Type":"ContainerStarted","Data":"cec05ad2447c9cc64ad115c06b650b8bce8cfe66b07ac95aeac0d02fb92f2e50"} Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.988629 4707 generic.go:334] "Generic (PLEG): container finished" podID="42cbeba2-2af1-46f0-a401-22a6b8af0913" 
containerID="2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712" exitCode=0 Feb 18 06:10:45 crc kubenswrapper[4707]: I0218 06:10:45.988707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffpsr" event={"ID":"42cbeba2-2af1-46f0-a401-22a6b8af0913","Type":"ContainerDied","Data":"2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712"} Feb 18 06:10:46 crc kubenswrapper[4707]: I0218 06:10:46.072122 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" path="/var/lib/kubelet/pods/14f1fa8e-a809-4a16-a85d-338d18628a8a/volumes" Feb 18 06:10:46 crc kubenswrapper[4707]: I0218 06:10:46.073756 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" path="/var/lib/kubelet/pods/4f112a71-47e0-4a0d-ab3e-d047e77ecd6b/volumes" Feb 18 06:10:47 crc kubenswrapper[4707]: I0218 06:10:47.011432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d474c7589-z56p2" event={"ID":"0341405c-1d6a-4750-b7c5-07ae9825d4b6","Type":"ContainerStarted","Data":"27ce268d1ce45a8bcf8d2e77446edcd7bf45fac7e2754818ed62f088ddb1f04e"} Feb 18 06:10:47 crc kubenswrapper[4707]: I0218 06:10:47.011736 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:47 crc kubenswrapper[4707]: I0218 06:10:47.018837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffpsr" event={"ID":"42cbeba2-2af1-46f0-a401-22a6b8af0913","Type":"ContainerStarted","Data":"a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1"} Feb 18 06:10:47 crc kubenswrapper[4707]: I0218 06:10:47.044162 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d474c7589-z56p2" podStartSLOduration=3.044141634 podStartE2EDuration="3.044141634s" 
podCreationTimestamp="2026-02-18 06:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:10:47.032658804 +0000 UTC m=+1383.680617938" watchObservedRunningTime="2026-02-18 06:10:47.044141634 +0000 UTC m=+1383.692100768" Feb 18 06:10:47 crc kubenswrapper[4707]: I0218 06:10:47.052140 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ffpsr" podStartSLOduration=2.484143231 podStartE2EDuration="5.052123518s" podCreationTimestamp="2026-02-18 06:10:42 +0000 UTC" firstStartedPulling="2026-02-18 06:10:43.908665216 +0000 UTC m=+1380.556624350" lastFinishedPulling="2026-02-18 06:10:46.476645503 +0000 UTC m=+1383.124604637" observedRunningTime="2026-02-18 06:10:47.051396089 +0000 UTC m=+1383.699355223" watchObservedRunningTime="2026-02-18 06:10:47.052123518 +0000 UTC m=+1383.700082652" Feb 18 06:10:49 crc kubenswrapper[4707]: I0218 06:10:49.545544 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b9d8756bc-mqn6s" podUID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.219:5353: i/o timeout" Feb 18 06:10:52 crc kubenswrapper[4707]: I0218 06:10:52.785897 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:52 crc kubenswrapper[4707]: I0218 06:10:52.786286 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:52 crc kubenswrapper[4707]: I0218 06:10:52.838369 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:53 crc kubenswrapper[4707]: I0218 06:10:53.118925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:53 crc kubenswrapper[4707]: I0218 06:10:53.210525 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ffpsr"] Feb 18 06:10:54 crc kubenswrapper[4707]: I0218 06:10:54.438747 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d474c7589-z56p2" Feb 18 06:10:54 crc kubenswrapper[4707]: I0218 06:10:54.514895 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f8b8c57f5-jctj2"] Feb 18 06:10:54 crc kubenswrapper[4707]: I0218 06:10:54.515143 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" podUID="a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" containerName="dnsmasq-dns" containerID="cri-o://71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109" gracePeriod=10 Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.047333 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.114461 4707 generic.go:334] "Generic (PLEG): container finished" podID="a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" containerID="71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109" exitCode=0 Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.114547 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.114571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" event={"ID":"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26","Type":"ContainerDied","Data":"71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109"} Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.114619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f8b8c57f5-jctj2" event={"ID":"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26","Type":"ContainerDied","Data":"b7fc44e2c6c5839254cc6f32f90f5c353dcadf4ae304bbdcd053bb8c5ca8563a"} Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.114643 4707 scope.go:117] "RemoveContainer" containerID="71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.114929 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ffpsr" podUID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerName="registry-server" containerID="cri-o://a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1" gracePeriod=2 Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.140448 4707 scope.go:117] "RemoveContainer" containerID="b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.161569 4707 scope.go:117] "RemoveContainer" containerID="71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109" Feb 18 06:10:55 crc kubenswrapper[4707]: E0218 06:10:55.162013 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109\": container with ID starting with 71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109 not found: ID does not exist" 
containerID="71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.162056 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109"} err="failed to get container status \"71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109\": rpc error: code = NotFound desc = could not find container \"71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109\": container with ID starting with 71483b85bf04f2af6da31f13108d5ce0c28cef7167eb617c0212c58c12db1109 not found: ID does not exist" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.162085 4707 scope.go:117] "RemoveContainer" containerID="b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6" Feb 18 06:10:55 crc kubenswrapper[4707]: E0218 06:10:55.162337 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6\": container with ID starting with b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6 not found: ID does not exist" containerID="b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.162377 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6"} err="failed to get container status \"b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6\": rpc error: code = NotFound desc = could not find container \"b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6\": container with ID starting with b065a219692411e156b9ee07ecc7f0d00bddcdec1b9acb8d46acf767123e39d6 not found: ID does not exist" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.237914 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-svc\") pod \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.238778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-swift-storage-0\") pod \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.238914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-nb\") pod \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.238989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-sb\") pod \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.239414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-config\") pod \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.239456 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-openstack-edpm-ipam\") pod \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\" 
(UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.239504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m7cq\" (UniqueName: \"kubernetes.io/projected/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-kube-api-access-9m7cq\") pod \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\" (UID: \"a6b5866e-7a21-47e4-be2d-7e5ce95b4f26\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.244425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-kube-api-access-9m7cq" (OuterVolumeSpecName: "kube-api-access-9m7cq") pod "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" (UID: "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26"). InnerVolumeSpecName "kube-api-access-9m7cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.299147 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" (UID: "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.299152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" (UID: "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.300161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" (UID: "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.304137 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" (UID: "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.306182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-config" (OuterVolumeSpecName: "config") pod "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" (UID: "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.315676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" (UID: "a6b5866e-7a21-47e4-be2d-7e5ce95b4f26"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.342181 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.342272 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.342287 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.342302 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-config\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.342317 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.342328 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m7cq\" (UniqueName: \"kubernetes.io/projected/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-kube-api-access-9m7cq\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.342339 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.459291 
4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f8b8c57f5-jctj2"] Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.468631 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f8b8c57f5-jctj2"] Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.514730 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.545275 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxktw\" (UniqueName: \"kubernetes.io/projected/42cbeba2-2af1-46f0-a401-22a6b8af0913-kube-api-access-qxktw\") pod \"42cbeba2-2af1-46f0-a401-22a6b8af0913\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.545356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-utilities\") pod \"42cbeba2-2af1-46f0-a401-22a6b8af0913\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.545492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-catalog-content\") pod \"42cbeba2-2af1-46f0-a401-22a6b8af0913\" (UID: \"42cbeba2-2af1-46f0-a401-22a6b8af0913\") " Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.561775 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-utilities" (OuterVolumeSpecName: "utilities") pod "42cbeba2-2af1-46f0-a401-22a6b8af0913" (UID: "42cbeba2-2af1-46f0-a401-22a6b8af0913"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.564302 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.590948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cbeba2-2af1-46f0-a401-22a6b8af0913-kube-api-access-qxktw" (OuterVolumeSpecName: "kube-api-access-qxktw") pod "42cbeba2-2af1-46f0-a401-22a6b8af0913" (UID: "42cbeba2-2af1-46f0-a401-22a6b8af0913"). InnerVolumeSpecName "kube-api-access-qxktw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.666324 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxktw\" (UniqueName: \"kubernetes.io/projected/42cbeba2-2af1-46f0-a401-22a6b8af0913-kube-api-access-qxktw\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.754732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42cbeba2-2af1-46f0-a401-22a6b8af0913" (UID: "42cbeba2-2af1-46f0-a401-22a6b8af0913"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:10:55 crc kubenswrapper[4707]: I0218 06:10:55.768428 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbeba2-2af1-46f0-a401-22a6b8af0913-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.065419 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" path="/var/lib/kubelet/pods/a6b5866e-7a21-47e4-be2d-7e5ce95b4f26/volumes" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.127728 4707 generic.go:334] "Generic (PLEG): container finished" podID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerID="a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1" exitCode=0 Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.127776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffpsr" event={"ID":"42cbeba2-2af1-46f0-a401-22a6b8af0913","Type":"ContainerDied","Data":"a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1"} Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.127820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffpsr" event={"ID":"42cbeba2-2af1-46f0-a401-22a6b8af0913","Type":"ContainerDied","Data":"cc48cadf50cd117cab58a8d7c7114023a55389c716bc8804751f0f7f1ec3077d"} Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.127840 4707 scope.go:117] "RemoveContainer" containerID="a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.127857 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ffpsr" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.151000 4707 scope.go:117] "RemoveContainer" containerID="2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.153961 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ffpsr"] Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.163479 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ffpsr"] Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.202166 4707 scope.go:117] "RemoveContainer" containerID="3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.225950 4707 scope.go:117] "RemoveContainer" containerID="a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1" Feb 18 06:10:56 crc kubenswrapper[4707]: E0218 06:10:56.226317 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1\": container with ID starting with a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1 not found: ID does not exist" containerID="a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.226345 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1"} err="failed to get container status \"a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1\": rpc error: code = NotFound desc = could not find container \"a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1\": container with ID starting with a0d4b480c3fd42fb435c27ea812d26a0a6d2def092b9912e8155a86d27d39ae1 not 
found: ID does not exist" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.226366 4707 scope.go:117] "RemoveContainer" containerID="2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712" Feb 18 06:10:56 crc kubenswrapper[4707]: E0218 06:10:56.226549 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712\": container with ID starting with 2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712 not found: ID does not exist" containerID="2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.226572 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712"} err="failed to get container status \"2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712\": rpc error: code = NotFound desc = could not find container \"2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712\": container with ID starting with 2f075d1292ae4752d600d7d85a1cefeb026de3e2e50f0e12c64b179e9fb81712 not found: ID does not exist" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.226585 4707 scope.go:117] "RemoveContainer" containerID="3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a" Feb 18 06:10:56 crc kubenswrapper[4707]: E0218 06:10:56.226833 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a\": container with ID starting with 3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a not found: ID does not exist" containerID="3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a" Feb 18 06:10:56 crc kubenswrapper[4707]: I0218 06:10:56.226860 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a"} err="failed to get container status \"3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a\": rpc error: code = NotFound desc = could not find container \"3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a\": container with ID starting with 3e22241bb19ad372e6d053db4f25863041c63528ed326ac3f91ab39ab7cf7d0a not found: ID does not exist" Feb 18 06:10:58 crc kubenswrapper[4707]: I0218 06:10:58.063646 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cbeba2-2af1-46f0-a401-22a6b8af0913" path="/var/lib/kubelet/pods/42cbeba2-2af1-46f0-a401-22a6b8af0913/volumes" Feb 18 06:11:05 crc kubenswrapper[4707]: I0218 06:11:05.211284 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b14ae66-3d41-476b-9ca7-2490e36de0aa" containerID="c7996143ceb46bed69f4aaac5e446ccdb54ba63fc4e84248afcd8dc8a8506f5e" exitCode=0 Feb 18 06:11:05 crc kubenswrapper[4707]: I0218 06:11:05.211395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b14ae66-3d41-476b-9ca7-2490e36de0aa","Type":"ContainerDied","Data":"c7996143ceb46bed69f4aaac5e446ccdb54ba63fc4e84248afcd8dc8a8506f5e"} Feb 18 06:11:06 crc kubenswrapper[4707]: I0218 06:11:06.222357 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d" containerID="abb1fd7dd664287eb18f4110d533faeb114ff5abaed57e8afe94819e899a4e4f" exitCode=0 Feb 18 06:11:06 crc kubenswrapper[4707]: I0218 06:11:06.222504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d","Type":"ContainerDied","Data":"abb1fd7dd664287eb18f4110d533faeb114ff5abaed57e8afe94819e899a4e4f"} Feb 18 06:11:06 crc kubenswrapper[4707]: I0218 06:11:06.226527 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b14ae66-3d41-476b-9ca7-2490e36de0aa","Type":"ContainerStarted","Data":"8d440c278829011ec9cc179277a1bd7a7fa0d9efde3572a0f1f59ad6e6a7af94"} Feb 18 06:11:06 crc kubenswrapper[4707]: I0218 06:11:06.226965 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 18 06:11:06 crc kubenswrapper[4707]: I0218 06:11:06.271997 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.27198309 podStartE2EDuration="37.27198309s" podCreationTimestamp="2026-02-18 06:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:11:06.27052355 +0000 UTC m=+1402.918482684" watchObservedRunningTime="2026-02-18 06:11:06.27198309 +0000 UTC m=+1402.919942214" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.244627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d","Type":"ContainerStarted","Data":"765c9de0264f5dbeee0030d307d8ae69e4d5ef380360bf521c5f3879888c00d3"} Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.245277 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.272531 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.272511912 podStartE2EDuration="36.272511912s" podCreationTimestamp="2026-02-18 06:10:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:11:07.270392994 +0000 UTC m=+1403.918352128" watchObservedRunningTime="2026-02-18 06:11:07.272511912 +0000 UTC m=+1403.920471046" Feb 18 06:11:07 crc 
kubenswrapper[4707]: I0218 06:11:07.732630 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t"] Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733171 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" containerName="dnsmasq-dns" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733195 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" containerName="dnsmasq-dns" Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733216 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerName="extract-content" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733224 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerName="extract-content" Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733264 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" containerName="init" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733272 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" containerName="init" Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733289 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerName="registry-server" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733296 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerName="registry-server" Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733318 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerName="extract-utilities" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733326 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerName="extract-utilities" Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733339 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" containerName="init" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733347 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" containerName="init" Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733363 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerName="extract-utilities" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733371 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerName="extract-utilities" Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733385 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" containerName="dnsmasq-dns" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733392 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" containerName="dnsmasq-dns" Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733400 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerName="extract-content" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733407 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerName="extract-content" Feb 18 06:11:07 crc kubenswrapper[4707]: E0218 06:11:07.733430 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerName="registry-server" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733438 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerName="registry-server" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733678 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b5866e-7a21-47e4-be2d-7e5ce95b4f26" containerName="dnsmasq-dns" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733713 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14f1fa8e-a809-4a16-a85d-338d18628a8a" containerName="registry-server" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733729 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f112a71-47e0-4a0d-ab3e-d047e77ecd6b" containerName="dnsmasq-dns" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.733746 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cbeba2-2af1-46f0-a401-22a6b8af0913" containerName="registry-server" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.734483 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.736744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.741429 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.742498 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.742603 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.764387 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t"] Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.902119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.902173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snkc\" (UniqueName: \"kubernetes.io/projected/4ccb192f-200d-453b-8829-3cdaddb0987b-kube-api-access-7snkc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.902207 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:07 crc kubenswrapper[4707]: I0218 06:11:07.902446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.004185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.004337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.004367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snkc\" (UniqueName: 
\"kubernetes.io/projected/4ccb192f-200d-453b-8829-3cdaddb0987b-kube-api-access-7snkc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.004396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.010404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.013009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.022405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.024040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snkc\" (UniqueName: \"kubernetes.io/projected/4ccb192f-200d-453b-8829-3cdaddb0987b-kube-api-access-7snkc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.069789 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:08 crc kubenswrapper[4707]: I0218 06:11:08.599364 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t"] Feb 18 06:11:09 crc kubenswrapper[4707]: I0218 06:11:09.287131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" event={"ID":"4ccb192f-200d-453b-8829-3cdaddb0987b","Type":"ContainerStarted","Data":"c5f8c7d09b8a34fc2c22081a1584de3edaa7760a6df5cc530e8b21f65da33e81"} Feb 18 06:11:20 crc kubenswrapper[4707]: I0218 06:11:20.218982 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 18 06:11:21 crc kubenswrapper[4707]: I0218 06:11:21.392738 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:11:21 crc kubenswrapper[4707]: I0218 06:11:21.393306 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" 
podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:11:21 crc kubenswrapper[4707]: I0218 06:11:21.434100 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 18 06:11:22 crc kubenswrapper[4707]: I0218 06:11:22.549068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" event={"ID":"4ccb192f-200d-453b-8829-3cdaddb0987b","Type":"ContainerStarted","Data":"a3ae8c4d10b5ad0a2fb75d3a4abfd47bcf290a2ed0112403b917c45f92a0a542"} Feb 18 06:11:22 crc kubenswrapper[4707]: I0218 06:11:22.563352 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" podStartSLOduration=2.574556234 podStartE2EDuration="15.563330577s" podCreationTimestamp="2026-02-18 06:11:07 +0000 UTC" firstStartedPulling="2026-02-18 06:11:08.623782681 +0000 UTC m=+1405.271741815" lastFinishedPulling="2026-02-18 06:11:21.612557024 +0000 UTC m=+1418.260516158" observedRunningTime="2026-02-18 06:11:22.559954116 +0000 UTC m=+1419.207913250" watchObservedRunningTime="2026-02-18 06:11:22.563330577 +0000 UTC m=+1419.211289711" Feb 18 06:11:32 crc kubenswrapper[4707]: I0218 06:11:32.633202 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ccb192f-200d-453b-8829-3cdaddb0987b" containerID="a3ae8c4d10b5ad0a2fb75d3a4abfd47bcf290a2ed0112403b917c45f92a0a542" exitCode=0 Feb 18 06:11:32 crc kubenswrapper[4707]: I0218 06:11:32.633314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" event={"ID":"4ccb192f-200d-453b-8829-3cdaddb0987b","Type":"ContainerDied","Data":"a3ae8c4d10b5ad0a2fb75d3a4abfd47bcf290a2ed0112403b917c45f92a0a542"} Feb 18 06:11:34 crc 
kubenswrapper[4707]: I0218 06:11:34.120427 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.225734 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7snkc\" (UniqueName: \"kubernetes.io/projected/4ccb192f-200d-453b-8829-3cdaddb0987b-kube-api-access-7snkc\") pod \"4ccb192f-200d-453b-8829-3cdaddb0987b\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.225892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-ssh-key-openstack-edpm-ipam\") pod \"4ccb192f-200d-453b-8829-3cdaddb0987b\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.226065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-inventory\") pod \"4ccb192f-200d-453b-8829-3cdaddb0987b\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.226125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-repo-setup-combined-ca-bundle\") pod \"4ccb192f-200d-453b-8829-3cdaddb0987b\" (UID: \"4ccb192f-200d-453b-8829-3cdaddb0987b\") " Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.231976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ccb192f-200d-453b-8829-3cdaddb0987b-kube-api-access-7snkc" (OuterVolumeSpecName: "kube-api-access-7snkc") pod "4ccb192f-200d-453b-8829-3cdaddb0987b" (UID: 
"4ccb192f-200d-453b-8829-3cdaddb0987b"). InnerVolumeSpecName "kube-api-access-7snkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.232688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4ccb192f-200d-453b-8829-3cdaddb0987b" (UID: "4ccb192f-200d-453b-8829-3cdaddb0987b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.257288 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-inventory" (OuterVolumeSpecName: "inventory") pod "4ccb192f-200d-453b-8829-3cdaddb0987b" (UID: "4ccb192f-200d-453b-8829-3cdaddb0987b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.258648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4ccb192f-200d-453b-8829-3cdaddb0987b" (UID: "4ccb192f-200d-453b-8829-3cdaddb0987b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.327864 4707 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.327998 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7snkc\" (UniqueName: \"kubernetes.io/projected/4ccb192f-200d-453b-8829-3cdaddb0987b-kube-api-access-7snkc\") on node \"crc\" DevicePath \"\"" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.328054 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.328111 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ccb192f-200d-453b-8829-3cdaddb0987b-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.651877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" event={"ID":"4ccb192f-200d-453b-8829-3cdaddb0987b","Type":"ContainerDied","Data":"c5f8c7d09b8a34fc2c22081a1584de3edaa7760a6df5cc530e8b21f65da33e81"} Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.652213 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5f8c7d09b8a34fc2c22081a1584de3edaa7760a6df5cc530e8b21f65da33e81" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.651941 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.728474 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q"] Feb 18 06:11:34 crc kubenswrapper[4707]: E0218 06:11:34.728983 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ccb192f-200d-453b-8829-3cdaddb0987b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.729013 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ccb192f-200d-453b-8829-3cdaddb0987b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.729295 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ccb192f-200d-453b-8829-3cdaddb0987b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.730098 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.736149 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.736380 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.736949 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.737095 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.745418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q"] Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.839815 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6cfb\" (UniqueName: \"kubernetes.io/projected/668c00e7-edea-47b0-a904-961fb756cb1d-kube-api-access-q6cfb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4hc8q\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.839956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4hc8q\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.840006 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4hc8q\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.941826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6cfb\" (UniqueName: \"kubernetes.io/projected/668c00e7-edea-47b0-a904-961fb756cb1d-kube-api-access-q6cfb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4hc8q\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.941873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4hc8q\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.941911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4hc8q\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.946089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-4hc8q\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.946144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4hc8q\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:34 crc kubenswrapper[4707]: I0218 06:11:34.961276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6cfb\" (UniqueName: \"kubernetes.io/projected/668c00e7-edea-47b0-a904-961fb756cb1d-kube-api-access-q6cfb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4hc8q\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:35 crc kubenswrapper[4707]: I0218 06:11:35.071611 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:35 crc kubenswrapper[4707]: I0218 06:11:35.613117 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q"] Feb 18 06:11:35 crc kubenswrapper[4707]: I0218 06:11:35.622842 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:11:35 crc kubenswrapper[4707]: I0218 06:11:35.671232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" event={"ID":"668c00e7-edea-47b0-a904-961fb756cb1d","Type":"ContainerStarted","Data":"b5fa738ed08bc11adffd8976dc77edcda6ea1229194ae154cbccbd8c40f4bdc5"} Feb 18 06:11:36 crc kubenswrapper[4707]: I0218 06:11:36.686478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" event={"ID":"668c00e7-edea-47b0-a904-961fb756cb1d","Type":"ContainerStarted","Data":"b6a70cf0a6246704041ce3f27b4c6a7b820e05bf309c29a71e9860c74e829b70"} Feb 18 06:11:36 crc kubenswrapper[4707]: I0218 06:11:36.707366 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" podStartSLOduration=2.277114681 podStartE2EDuration="2.707343575s" podCreationTimestamp="2026-02-18 06:11:34 +0000 UTC" firstStartedPulling="2026-02-18 06:11:35.622556032 +0000 UTC m=+1432.270515166" lastFinishedPulling="2026-02-18 06:11:36.052784926 +0000 UTC m=+1432.700744060" observedRunningTime="2026-02-18 06:11:36.705164416 +0000 UTC m=+1433.353123580" watchObservedRunningTime="2026-02-18 06:11:36.707343575 +0000 UTC m=+1433.355302709" Feb 18 06:11:39 crc kubenswrapper[4707]: I0218 06:11:39.726772 4707 generic.go:334] "Generic (PLEG): container finished" podID="668c00e7-edea-47b0-a904-961fb756cb1d" containerID="b6a70cf0a6246704041ce3f27b4c6a7b820e05bf309c29a71e9860c74e829b70" 
exitCode=0 Feb 18 06:11:39 crc kubenswrapper[4707]: I0218 06:11:39.726852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" event={"ID":"668c00e7-edea-47b0-a904-961fb756cb1d","Type":"ContainerDied","Data":"b6a70cf0a6246704041ce3f27b4c6a7b820e05bf309c29a71e9860c74e829b70"} Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.195722 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.381170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-ssh-key-openstack-edpm-ipam\") pod \"668c00e7-edea-47b0-a904-961fb756cb1d\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.381352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6cfb\" (UniqueName: \"kubernetes.io/projected/668c00e7-edea-47b0-a904-961fb756cb1d-kube-api-access-q6cfb\") pod \"668c00e7-edea-47b0-a904-961fb756cb1d\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.381404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-inventory\") pod \"668c00e7-edea-47b0-a904-961fb756cb1d\" (UID: \"668c00e7-edea-47b0-a904-961fb756cb1d\") " Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.386420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668c00e7-edea-47b0-a904-961fb756cb1d-kube-api-access-q6cfb" (OuterVolumeSpecName: "kube-api-access-q6cfb") pod "668c00e7-edea-47b0-a904-961fb756cb1d" (UID: 
"668c00e7-edea-47b0-a904-961fb756cb1d"). InnerVolumeSpecName "kube-api-access-q6cfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.407776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "668c00e7-edea-47b0-a904-961fb756cb1d" (UID: "668c00e7-edea-47b0-a904-961fb756cb1d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.417086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-inventory" (OuterVolumeSpecName: "inventory") pod "668c00e7-edea-47b0-a904-961fb756cb1d" (UID: "668c00e7-edea-47b0-a904-961fb756cb1d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.484050 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.484097 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6cfb\" (UniqueName: \"kubernetes.io/projected/668c00e7-edea-47b0-a904-961fb756cb1d-kube-api-access-q6cfb\") on node \"crc\" DevicePath \"\"" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.484112 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668c00e7-edea-47b0-a904-961fb756cb1d-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.748814 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" event={"ID":"668c00e7-edea-47b0-a904-961fb756cb1d","Type":"ContainerDied","Data":"b5fa738ed08bc11adffd8976dc77edcda6ea1229194ae154cbccbd8c40f4bdc5"} Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.748860 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5fa738ed08bc11adffd8976dc77edcda6ea1229194ae154cbccbd8c40f4bdc5" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.748918 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4hc8q" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.838161 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68"] Feb 18 06:11:41 crc kubenswrapper[4707]: E0218 06:11:41.839056 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668c00e7-edea-47b0-a904-961fb756cb1d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.839079 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="668c00e7-edea-47b0-a904-961fb756cb1d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.839636 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="668c00e7-edea-47b0-a904-961fb756cb1d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.840791 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.845614 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.845647 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.846012 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.846289 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.854206 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68"] Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.994865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.995838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.995897 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:41 crc kubenswrapper[4707]: I0218 06:11:41.996049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbgv8\" (UniqueName: \"kubernetes.io/projected/3c7dd778-4759-4515-bbf0-bbc5123e822f-kube-api-access-mbgv8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.098557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbgv8\" (UniqueName: \"kubernetes.io/projected/3c7dd778-4759-4515-bbf0-bbc5123e822f-kube-api-access-mbgv8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.099063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.099136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.099162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.104344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.108228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.108492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.114638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbgv8\" (UniqueName: \"kubernetes.io/projected/3c7dd778-4759-4515-bbf0-bbc5123e822f-kube-api-access-mbgv8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-24v68\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.197611 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:11:42 crc kubenswrapper[4707]: I0218 06:11:42.788543 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68"] Feb 18 06:11:43 crc kubenswrapper[4707]: I0218 06:11:43.767926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" event={"ID":"3c7dd778-4759-4515-bbf0-bbc5123e822f","Type":"ContainerStarted","Data":"6251a35b91d3add6a878cad77e04221e91224294791760a48475cebe9c24ee98"} Feb 18 06:11:43 crc kubenswrapper[4707]: I0218 06:11:43.768282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" event={"ID":"3c7dd778-4759-4515-bbf0-bbc5123e822f","Type":"ContainerStarted","Data":"a70a4f63001a4e47392992264bf01a23489e208c61c5ac0e7e9ebf45a0909a32"} Feb 18 06:11:43 crc kubenswrapper[4707]: I0218 06:11:43.788906 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" podStartSLOduration=2.383465887 podStartE2EDuration="2.788884023s" podCreationTimestamp="2026-02-18 06:11:41 +0000 UTC" firstStartedPulling="2026-02-18 06:11:42.799706332 +0000 UTC m=+1439.447665476" 
lastFinishedPulling="2026-02-18 06:11:43.205124468 +0000 UTC m=+1439.853083612" observedRunningTime="2026-02-18 06:11:43.784517495 +0000 UTC m=+1440.432476629" watchObservedRunningTime="2026-02-18 06:11:43.788884023 +0000 UTC m=+1440.436843147" Feb 18 06:11:47 crc kubenswrapper[4707]: I0218 06:11:47.174200 4707 scope.go:117] "RemoveContainer" containerID="2d923c60f672c03514692cd1e760948e35d5cd6c66436215cf7fd41b05ae775e" Feb 18 06:11:47 crc kubenswrapper[4707]: I0218 06:11:47.195819 4707 scope.go:117] "RemoveContainer" containerID="cb0b2b6d9acbce7710bd04e0bc0ac9801ca77a14b5298962d0f7e26510e708fb" Feb 18 06:11:47 crc kubenswrapper[4707]: I0218 06:11:47.241786 4707 scope.go:117] "RemoveContainer" containerID="859f719120d27b68300c803790fadce72cbf939390fe2b89fedc2d85f77a04b8" Feb 18 06:11:51 crc kubenswrapper[4707]: I0218 06:11:51.382196 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:11:51 crc kubenswrapper[4707]: I0218 06:11:51.382881 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:12:21 crc kubenswrapper[4707]: I0218 06:12:21.382454 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:12:21 crc kubenswrapper[4707]: I0218 06:12:21.383130 4707 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:12:21 crc kubenswrapper[4707]: I0218 06:12:21.383346 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:12:21 crc kubenswrapper[4707]: I0218 06:12:21.384734 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:12:21 crc kubenswrapper[4707]: I0218 06:12:21.384928 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" gracePeriod=600 Feb 18 06:12:21 crc kubenswrapper[4707]: E0218 06:12:21.516146 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:12:22 crc kubenswrapper[4707]: I0218 06:12:22.179270 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" 
containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" exitCode=0 Feb 18 06:12:22 crc kubenswrapper[4707]: I0218 06:12:22.179312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af"} Feb 18 06:12:22 crc kubenswrapper[4707]: I0218 06:12:22.179373 4707 scope.go:117] "RemoveContainer" containerID="3eb8d09ea3950a1c29c70e73d11ea5133c61c40a9512fdef46057924b3898430" Feb 18 06:12:22 crc kubenswrapper[4707]: I0218 06:12:22.180342 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:12:22 crc kubenswrapper[4707]: E0218 06:12:22.180737 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:12:36 crc kubenswrapper[4707]: I0218 06:12:36.053874 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:12:36 crc kubenswrapper[4707]: E0218 06:12:36.054862 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:12:47 crc kubenswrapper[4707]: I0218 
06:12:47.425202 4707 scope.go:117] "RemoveContainer" containerID="51a6b225dae50b85cc67ae1341f9aa5bed51407265f983bd797477bf072586d4" Feb 18 06:12:47 crc kubenswrapper[4707]: I0218 06:12:47.451978 4707 scope.go:117] "RemoveContainer" containerID="dd39673696c1c71348820e2f471078d3a0a78f235d0d501e6988b7a84c1af716" Feb 18 06:12:47 crc kubenswrapper[4707]: I0218 06:12:47.477399 4707 scope.go:117] "RemoveContainer" containerID="bc680cffb8b9228f3585dc8539a3c1fc64b473ab07f48a5cedeeb4beabf17e79" Feb 18 06:12:47 crc kubenswrapper[4707]: I0218 06:12:47.505611 4707 scope.go:117] "RemoveContainer" containerID="9a927c7e6a1dd384f92770bbefa0c1a08da8a90b861fac8757fdf67122c93d2a" Feb 18 06:12:47 crc kubenswrapper[4707]: I0218 06:12:47.534872 4707 scope.go:117] "RemoveContainer" containerID="21a1e13e92cf035a710054e2d9234fe71fc6780b376905ee70fa92424ddcec77" Feb 18 06:12:47 crc kubenswrapper[4707]: I0218 06:12:47.562934 4707 scope.go:117] "RemoveContainer" containerID="cc9fb961bf97a03dc9897ee5392ff906f7ab810a5a5158fb6791e7be27165dd1" Feb 18 06:12:47 crc kubenswrapper[4707]: I0218 06:12:47.582915 4707 scope.go:117] "RemoveContainer" containerID="3a273d301b95ba0c04b5d4dfd0516aead8108ca6155655dfd482f7d65bf1f1c6" Feb 18 06:12:48 crc kubenswrapper[4707]: I0218 06:12:48.053117 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:12:48 crc kubenswrapper[4707]: E0218 06:12:48.053750 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:13:03 crc kubenswrapper[4707]: I0218 06:13:03.053980 4707 scope.go:117] "RemoveContainer" 
containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:13:03 crc kubenswrapper[4707]: E0218 06:13:03.054784 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:13:18 crc kubenswrapper[4707]: I0218 06:13:18.053534 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:13:18 crc kubenswrapper[4707]: E0218 06:13:18.055241 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:13:30 crc kubenswrapper[4707]: I0218 06:13:30.053656 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:13:30 crc kubenswrapper[4707]: E0218 06:13:30.054638 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:13:41 crc kubenswrapper[4707]: I0218 06:13:41.054134 4707 scope.go:117] 
"RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:13:41 crc kubenswrapper[4707]: E0218 06:13:41.055524 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:13:55 crc kubenswrapper[4707]: I0218 06:13:55.054776 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:13:55 crc kubenswrapper[4707]: E0218 06:13:55.059333 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:14:08 crc kubenswrapper[4707]: I0218 06:14:08.055404 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:14:08 crc kubenswrapper[4707]: E0218 06:14:08.056205 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:14:21 crc kubenswrapper[4707]: I0218 06:14:21.053350 
4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:14:21 crc kubenswrapper[4707]: E0218 06:14:21.054122 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:14:32 crc kubenswrapper[4707]: I0218 06:14:32.053217 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:14:32 crc kubenswrapper[4707]: E0218 06:14:32.054131 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:14:34 crc kubenswrapper[4707]: I0218 06:14:34.591193 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c7dd778-4759-4515-bbf0-bbc5123e822f" containerID="6251a35b91d3add6a878cad77e04221e91224294791760a48475cebe9c24ee98" exitCode=0 Feb 18 06:14:34 crc kubenswrapper[4707]: I0218 06:14:34.591294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" event={"ID":"3c7dd778-4759-4515-bbf0-bbc5123e822f","Type":"ContainerDied","Data":"6251a35b91d3add6a878cad77e04221e91224294791760a48475cebe9c24ee98"} Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.015156 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.068717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbgv8\" (UniqueName: \"kubernetes.io/projected/3c7dd778-4759-4515-bbf0-bbc5123e822f-kube-api-access-mbgv8\") pod \"3c7dd778-4759-4515-bbf0-bbc5123e822f\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.068809 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-bootstrap-combined-ca-bundle\") pod \"3c7dd778-4759-4515-bbf0-bbc5123e822f\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.068841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-ssh-key-openstack-edpm-ipam\") pod \"3c7dd778-4759-4515-bbf0-bbc5123e822f\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.068895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-inventory\") pod \"3c7dd778-4759-4515-bbf0-bbc5123e822f\" (UID: \"3c7dd778-4759-4515-bbf0-bbc5123e822f\") " Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.075239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3c7dd778-4759-4515-bbf0-bbc5123e822f" (UID: "3c7dd778-4759-4515-bbf0-bbc5123e822f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.086176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c7dd778-4759-4515-bbf0-bbc5123e822f-kube-api-access-mbgv8" (OuterVolumeSpecName: "kube-api-access-mbgv8") pod "3c7dd778-4759-4515-bbf0-bbc5123e822f" (UID: "3c7dd778-4759-4515-bbf0-bbc5123e822f"). InnerVolumeSpecName "kube-api-access-mbgv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.096134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3c7dd778-4759-4515-bbf0-bbc5123e822f" (UID: "3c7dd778-4759-4515-bbf0-bbc5123e822f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.101458 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-inventory" (OuterVolumeSpecName: "inventory") pod "3c7dd778-4759-4515-bbf0-bbc5123e822f" (UID: "3c7dd778-4759-4515-bbf0-bbc5123e822f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.172328 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbgv8\" (UniqueName: \"kubernetes.io/projected/3c7dd778-4759-4515-bbf0-bbc5123e822f-kube-api-access-mbgv8\") on node \"crc\" DevicePath \"\"" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.172374 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.172389 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.172402 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c7dd778-4759-4515-bbf0-bbc5123e822f-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.612892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" event={"ID":"3c7dd778-4759-4515-bbf0-bbc5123e822f","Type":"ContainerDied","Data":"a70a4f63001a4e47392992264bf01a23489e208c61c5ac0e7e9ebf45a0909a32"} Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.612936 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-24v68" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.612946 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a70a4f63001a4e47392992264bf01a23489e208c61c5ac0e7e9ebf45a0909a32" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.685163 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45"] Feb 18 06:14:36 crc kubenswrapper[4707]: E0218 06:14:36.685554 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c7dd778-4759-4515-bbf0-bbc5123e822f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.685570 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c7dd778-4759-4515-bbf0-bbc5123e822f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.685740 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c7dd778-4759-4515-bbf0-bbc5123e822f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.686354 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.689124 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.692418 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.692735 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.693148 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.695048 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45"] Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.781380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xtr45\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.781472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7668p\" (UniqueName: \"kubernetes.io/projected/d7bc2edd-9db2-40df-be54-0db1c1b462fa-kube-api-access-7668p\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xtr45\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 
06:14:36.781548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xtr45\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.883938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xtr45\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.884265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7668p\" (UniqueName: \"kubernetes.io/projected/d7bc2edd-9db2-40df-be54-0db1c1b462fa-kube-api-access-7668p\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xtr45\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.884409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xtr45\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.887616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xtr45\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.888190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xtr45\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:36 crc kubenswrapper[4707]: I0218 06:14:36.901575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7668p\" (UniqueName: \"kubernetes.io/projected/d7bc2edd-9db2-40df-be54-0db1c1b462fa-kube-api-access-7668p\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xtr45\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:37 crc kubenswrapper[4707]: I0218 06:14:37.008329 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:14:37 crc kubenswrapper[4707]: I0218 06:14:37.538534 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45"] Feb 18 06:14:37 crc kubenswrapper[4707]: I0218 06:14:37.621992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" event={"ID":"d7bc2edd-9db2-40df-be54-0db1c1b462fa","Type":"ContainerStarted","Data":"d0a788f7b61936e4dd747f0ab3a41d619c45ee4f8d00a35e97279b573885f60b"} Feb 18 06:14:38 crc kubenswrapper[4707]: I0218 06:14:38.639579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" event={"ID":"d7bc2edd-9db2-40df-be54-0db1c1b462fa","Type":"ContainerStarted","Data":"9a34633e085836198f88ad85199c289849d3c4ad0f9b0062daf12e8ffa39d69d"} Feb 18 06:14:45 crc kubenswrapper[4707]: I0218 06:14:45.053363 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:14:45 crc kubenswrapper[4707]: E0218 06:14:45.054076 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.053995 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:15:00 crc kubenswrapper[4707]: E0218 06:15:00.054723 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.145236 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" podStartSLOduration=23.721013304 podStartE2EDuration="24.145213959s" podCreationTimestamp="2026-02-18 06:14:36 +0000 UTC" firstStartedPulling="2026-02-18 06:14:37.565266579 +0000 UTC m=+1614.213225713" lastFinishedPulling="2026-02-18 06:14:37.989467234 +0000 UTC m=+1614.637426368" observedRunningTime="2026-02-18 06:14:38.666482952 +0000 UTC m=+1615.314442126" watchObservedRunningTime="2026-02-18 06:15:00.145213959 +0000 UTC m=+1636.793173103" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.152988 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n"] Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.154419 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.157003 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.157044 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.169657 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n"] Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.315986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8019647-1774-483b-b11e-b478b894487f-config-volume\") pod \"collect-profiles-29523255-vft5n\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.316074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xz2n\" (UniqueName: \"kubernetes.io/projected/e8019647-1774-483b-b11e-b478b894487f-kube-api-access-4xz2n\") pod \"collect-profiles-29523255-vft5n\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.316282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8019647-1774-483b-b11e-b478b894487f-secret-volume\") pod \"collect-profiles-29523255-vft5n\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.418667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8019647-1774-483b-b11e-b478b894487f-config-volume\") pod \"collect-profiles-29523255-vft5n\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.418742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xz2n\" (UniqueName: \"kubernetes.io/projected/e8019647-1774-483b-b11e-b478b894487f-kube-api-access-4xz2n\") pod \"collect-profiles-29523255-vft5n\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.418770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8019647-1774-483b-b11e-b478b894487f-secret-volume\") pod \"collect-profiles-29523255-vft5n\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.419626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8019647-1774-483b-b11e-b478b894487f-config-volume\") pod \"collect-profiles-29523255-vft5n\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.424710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e8019647-1774-483b-b11e-b478b894487f-secret-volume\") pod \"collect-profiles-29523255-vft5n\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.438517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xz2n\" (UniqueName: \"kubernetes.io/projected/e8019647-1774-483b-b11e-b478b894487f-kube-api-access-4xz2n\") pod \"collect-profiles-29523255-vft5n\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.483775 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:00 crc kubenswrapper[4707]: I0218 06:15:00.925619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n"] Feb 18 06:15:01 crc kubenswrapper[4707]: I0218 06:15:01.842931 4707 generic.go:334] "Generic (PLEG): container finished" podID="e8019647-1774-483b-b11e-b478b894487f" containerID="692387cbd9ffa3ca4a91d95e9bad67c1250a94a77fdb6d86a94157bbebbd572c" exitCode=0 Feb 18 06:15:01 crc kubenswrapper[4707]: I0218 06:15:01.843018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" event={"ID":"e8019647-1774-483b-b11e-b478b894487f","Type":"ContainerDied","Data":"692387cbd9ffa3ca4a91d95e9bad67c1250a94a77fdb6d86a94157bbebbd572c"} Feb 18 06:15:01 crc kubenswrapper[4707]: I0218 06:15:01.843259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" 
event={"ID":"e8019647-1774-483b-b11e-b478b894487f","Type":"ContainerStarted","Data":"91fe930b64c7ac5ae9e626b777ea98b0089182d5ca0b7765f40f14e2d45e3d8e"} Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.189277 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.377096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8019647-1774-483b-b11e-b478b894487f-config-volume\") pod \"e8019647-1774-483b-b11e-b478b894487f\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.377191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xz2n\" (UniqueName: \"kubernetes.io/projected/e8019647-1774-483b-b11e-b478b894487f-kube-api-access-4xz2n\") pod \"e8019647-1774-483b-b11e-b478b894487f\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.377378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8019647-1774-483b-b11e-b478b894487f-secret-volume\") pod \"e8019647-1774-483b-b11e-b478b894487f\" (UID: \"e8019647-1774-483b-b11e-b478b894487f\") " Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.378453 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8019647-1774-483b-b11e-b478b894487f-config-volume" (OuterVolumeSpecName: "config-volume") pod "e8019647-1774-483b-b11e-b478b894487f" (UID: "e8019647-1774-483b-b11e-b478b894487f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.384083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8019647-1774-483b-b11e-b478b894487f-kube-api-access-4xz2n" (OuterVolumeSpecName: "kube-api-access-4xz2n") pod "e8019647-1774-483b-b11e-b478b894487f" (UID: "e8019647-1774-483b-b11e-b478b894487f"). InnerVolumeSpecName "kube-api-access-4xz2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.384457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8019647-1774-483b-b11e-b478b894487f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e8019647-1774-483b-b11e-b478b894487f" (UID: "e8019647-1774-483b-b11e-b478b894487f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.479705 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8019647-1774-483b-b11e-b478b894487f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.479739 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xz2n\" (UniqueName: \"kubernetes.io/projected/e8019647-1774-483b-b11e-b478b894487f-kube-api-access-4xz2n\") on node \"crc\" DevicePath \"\"" Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.479751 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8019647-1774-483b-b11e-b478b894487f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.878573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" 
event={"ID":"e8019647-1774-483b-b11e-b478b894487f","Type":"ContainerDied","Data":"91fe930b64c7ac5ae9e626b777ea98b0089182d5ca0b7765f40f14e2d45e3d8e"} Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.879059 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91fe930b64c7ac5ae9e626b777ea98b0089182d5ca0b7765f40f14e2d45e3d8e" Feb 18 06:15:03 crc kubenswrapper[4707]: I0218 06:15:03.878766 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n" Feb 18 06:15:12 crc kubenswrapper[4707]: I0218 06:15:12.046918 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-w556k"] Feb 18 06:15:12 crc kubenswrapper[4707]: I0218 06:15:12.053748 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:15:12 crc kubenswrapper[4707]: E0218 06:15:12.054347 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:15:12 crc kubenswrapper[4707]: I0218 06:15:12.063963 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6da2-account-create-update-h65t8"] Feb 18 06:15:12 crc kubenswrapper[4707]: I0218 06:15:12.069528 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-wf8p8"] Feb 18 06:15:12 crc kubenswrapper[4707]: I0218 06:15:12.078359 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6da2-account-create-update-h65t8"] Feb 18 06:15:12 crc kubenswrapper[4707]: I0218 06:15:12.087389 4707 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-w556k"] Feb 18 06:15:12 crc kubenswrapper[4707]: I0218 06:15:12.095393 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-wf8p8"] Feb 18 06:15:12 crc kubenswrapper[4707]: I0218 06:15:12.103202 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-898jt"] Feb 18 06:15:12 crc kubenswrapper[4707]: I0218 06:15:12.111240 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-898jt"] Feb 18 06:15:13 crc kubenswrapper[4707]: I0218 06:15:13.032048 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9bf7-account-create-update-trgcj"] Feb 18 06:15:13 crc kubenswrapper[4707]: I0218 06:15:13.043593 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8c09-account-create-update-f5p6n"] Feb 18 06:15:13 crc kubenswrapper[4707]: I0218 06:15:13.053245 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9bf7-account-create-update-trgcj"] Feb 18 06:15:13 crc kubenswrapper[4707]: I0218 06:15:13.061599 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8c09-account-create-update-f5p6n"] Feb 18 06:15:14 crc kubenswrapper[4707]: I0218 06:15:14.072732 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b042550-a34c-44f7-9a49-29bb4f865dbd" path="/var/lib/kubelet/pods/0b042550-a34c-44f7-9a49-29bb4f865dbd/volumes" Feb 18 06:15:14 crc kubenswrapper[4707]: I0218 06:15:14.074280 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1439301a-c008-4af2-bb69-6857397051f3" path="/var/lib/kubelet/pods/1439301a-c008-4af2-bb69-6857397051f3/volumes" Feb 18 06:15:14 crc kubenswrapper[4707]: I0218 06:15:14.075432 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39f42416-5e4a-475b-862e-71cb10661178" 
path="/var/lib/kubelet/pods/39f42416-5e4a-475b-862e-71cb10661178/volumes" Feb 18 06:15:14 crc kubenswrapper[4707]: I0218 06:15:14.076584 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ffc590-9556-4e8b-9faf-ed5df3a747a8" path="/var/lib/kubelet/pods/80ffc590-9556-4e8b-9faf-ed5df3a747a8/volumes" Feb 18 06:15:14 crc kubenswrapper[4707]: I0218 06:15:14.078330 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8496e6be-0819-4878-a823-31f90c5fd272" path="/var/lib/kubelet/pods/8496e6be-0819-4878-a823-31f90c5fd272/volumes" Feb 18 06:15:14 crc kubenswrapper[4707]: I0218 06:15:14.079407 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0fe62a2-3b67-40cf-8d6c-fd68f9667276" path="/var/lib/kubelet/pods/d0fe62a2-3b67-40cf-8d6c-fd68f9667276/volumes" Feb 18 06:15:23 crc kubenswrapper[4707]: I0218 06:15:23.054223 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:15:23 crc kubenswrapper[4707]: E0218 06:15:23.056137 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:15:35 crc kubenswrapper[4707]: I0218 06:15:35.030959 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gk84m"] Feb 18 06:15:35 crc kubenswrapper[4707]: I0218 06:15:35.042933 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gk84m"] Feb 18 06:15:36 crc kubenswrapper[4707]: I0218 06:15:36.063055 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="41ea2142-62af-4350-a711-2a7cbe23d990" path="/var/lib/kubelet/pods/41ea2142-62af-4350-a711-2a7cbe23d990/volumes" Feb 18 06:15:37 crc kubenswrapper[4707]: I0218 06:15:37.053533 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:15:37 crc kubenswrapper[4707]: E0218 06:15:37.054607 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.098975 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-czbp6"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.105701 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-dnt8j"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.113514 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hcll4"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.121741 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-861c-account-create-update-fnpxl"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.130531 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-6w79n"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.139110 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-dnt8j"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.152998 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6939-account-create-update-747zz"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.165618 
4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hcll4"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.173700 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-czbp6"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.182041 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-861c-account-create-update-fnpxl"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.191360 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-vn5mm"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.201846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-398f-account-create-update-rf487"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.209866 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3c9b-account-create-update-qs4bz"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.217859 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-6w79n"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.225043 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-vn5mm"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.234634 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3c9b-account-create-update-qs4bz"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.243849 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-398f-account-create-update-rf487"] Feb 18 06:15:42 crc kubenswrapper[4707]: I0218 06:15:42.249139 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6939-account-create-update-747zz"] Feb 18 06:15:44 crc kubenswrapper[4707]: I0218 06:15:44.071086 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b014870-bb4b-4241-bf1e-1b579389c879" 
path="/var/lib/kubelet/pods/1b014870-bb4b-4241-bf1e-1b579389c879/volumes" Feb 18 06:15:44 crc kubenswrapper[4707]: I0218 06:15:44.077625 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b8210d-4e51-4dae-b09c-dbf714c1ca7e" path="/var/lib/kubelet/pods/24b8210d-4e51-4dae-b09c-dbf714c1ca7e/volumes" Feb 18 06:15:44 crc kubenswrapper[4707]: I0218 06:15:44.084661 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73b7b49a-3f2b-4961-ac42-b426af83bea2" path="/var/lib/kubelet/pods/73b7b49a-3f2b-4961-ac42-b426af83bea2/volumes" Feb 18 06:15:44 crc kubenswrapper[4707]: I0218 06:15:44.085419 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33a8c27-088e-4cc4-9447-3e0e2d1e3e85" path="/var/lib/kubelet/pods/a33a8c27-088e-4cc4-9447-3e0e2d1e3e85/volumes" Feb 18 06:15:44 crc kubenswrapper[4707]: I0218 06:15:44.086762 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0171309-d2f6-4ff6-bcd3-ac892477355c" path="/var/lib/kubelet/pods/c0171309-d2f6-4ff6-bcd3-ac892477355c/volumes" Feb 18 06:15:44 crc kubenswrapper[4707]: I0218 06:15:44.087933 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c15f7d-dfba-43f1-bb09-48d1ce940ed2" path="/var/lib/kubelet/pods/e2c15f7d-dfba-43f1-bb09-48d1ce940ed2/volumes" Feb 18 06:15:44 crc kubenswrapper[4707]: I0218 06:15:44.089329 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0ff79e-f841-48f7-9714-ca0c20783edf" path="/var/lib/kubelet/pods/ed0ff79e-f841-48f7-9714-ca0c20783edf/volumes" Feb 18 06:15:44 crc kubenswrapper[4707]: I0218 06:15:44.090145 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d" path="/var/lib/kubelet/pods/ed8406cd-9ed2-4335-a2b6-05c2ac1ccc0d/volumes" Feb 18 06:15:44 crc kubenswrapper[4707]: I0218 06:15:44.097435 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc4e65cf-2f40-47a2-85b6-e46b9d2712f8" 
path="/var/lib/kubelet/pods/fc4e65cf-2f40-47a2-85b6-e46b9d2712f8/volumes" Feb 18 06:15:47 crc kubenswrapper[4707]: I0218 06:15:47.026282 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dhgkp"] Feb 18 06:15:47 crc kubenswrapper[4707]: I0218 06:15:47.034338 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dhgkp"] Feb 18 06:15:47 crc kubenswrapper[4707]: I0218 06:15:47.789911 4707 scope.go:117] "RemoveContainer" containerID="a1f56cd7ec7dbf1cfec672b58cc27ba61f8a0d7bbaf37a5c07a9e4a14ca8e713" Feb 18 06:15:47 crc kubenswrapper[4707]: I0218 06:15:47.813910 4707 scope.go:117] "RemoveContainer" containerID="93df32c77c64e39dd7b1ddb3f37b56c5a0eb5feee444f7f7657c564fd40e86cd" Feb 18 06:15:47 crc kubenswrapper[4707]: I0218 06:15:47.883353 4707 scope.go:117] "RemoveContainer" containerID="67ff09dce43603fcb7ef6b4a16eec7de5d913b3577a068bc70d3b28eb91ef8ae" Feb 18 06:15:47 crc kubenswrapper[4707]: I0218 06:15:47.933082 4707 scope.go:117] "RemoveContainer" containerID="4cfbcb18f285aab87f52594a6e5485f7310b9269b167ac8b6f9eb153215d6dd9" Feb 18 06:15:47 crc kubenswrapper[4707]: I0218 06:15:47.954710 4707 scope.go:117] "RemoveContainer" containerID="70a2eeea7de4052363d0b21f8e94ef8781499bdf50dec7837a09fc61afa0c6ac" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:47.999990 4707 scope.go:117] "RemoveContainer" containerID="1426c314ff7a3e186e8d92c82cc7e770a994448844a585fd6b5cec870c3bafcd" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.042919 4707 scope.go:117] "RemoveContainer" containerID="e6f3506bdc428e67019473cffb371b7e71fd6387146ddbd9cffc02a585aee599" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.063738 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4" path="/var/lib/kubelet/pods/0a675c2d-6f3e-46d9-8efe-7f7e5570b9a4/volumes" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.066962 4707 scope.go:117] "RemoveContainer" 
containerID="d5e9efaaa534d435389bcc215327bbea476ca3c9b3fa37945bf2de3248f0a6f8" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.090070 4707 scope.go:117] "RemoveContainer" containerID="7e661fde4df97830aca9914df1d357493866ccfdb1951939d2fd404fe4456c62" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.111543 4707 scope.go:117] "RemoveContainer" containerID="554122bf0bb556257b07f9548a7bc6c40cc2a78ce08254bfae8c87e86ef6300f" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.132231 4707 scope.go:117] "RemoveContainer" containerID="8710a9fc36de1d8ea404a56cc48948abfebf625887957d5fe45131c8192c3be2" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.157057 4707 scope.go:117] "RemoveContainer" containerID="4cda4ee41b31596d77a33c53f558aab4c37e2a475cfb4eb9cf6fceace1caa048" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.175274 4707 scope.go:117] "RemoveContainer" containerID="48294d1862605c74084bdb820b5a7da371f2eb93b3dafc638792122652216800" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.194410 4707 scope.go:117] "RemoveContainer" containerID="7dbc8d02dc0d56f344cc7f14a04727605f837de52038bcd6d542f740b27700ec" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.212932 4707 scope.go:117] "RemoveContainer" containerID="dfce4a2781663f21df1a056f62db0f42e7fd992c53179ed56a4a5938f2814a4f" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.245240 4707 scope.go:117] "RemoveContainer" containerID="ac11c4a17e423f0e89027de29bb407c5966f3712ef77abad08e296ce9691f732" Feb 18 06:15:48 crc kubenswrapper[4707]: I0218 06:15:48.276814 4707 scope.go:117] "RemoveContainer" containerID="3becbb6e025eeb676be7a213ab27f3ae08405832a9869a73829c02bd1034c8d6" Feb 18 06:15:50 crc kubenswrapper[4707]: I0218 06:15:50.053467 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:15:50 crc kubenswrapper[4707]: E0218 06:15:50.054359 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:16:01 crc kubenswrapper[4707]: I0218 06:16:01.054038 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:16:01 crc kubenswrapper[4707]: E0218 06:16:01.055154 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:16:03 crc kubenswrapper[4707]: I0218 06:16:03.459148 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7bc2edd-9db2-40df-be54-0db1c1b462fa" containerID="9a34633e085836198f88ad85199c289849d3c4ad0f9b0062daf12e8ffa39d69d" exitCode=0 Feb 18 06:16:03 crc kubenswrapper[4707]: I0218 06:16:03.459239 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" event={"ID":"d7bc2edd-9db2-40df-be54-0db1c1b462fa","Type":"ContainerDied","Data":"9a34633e085836198f88ad85199c289849d3c4ad0f9b0062daf12e8ffa39d69d"} Feb 18 06:16:04 crc kubenswrapper[4707]: I0218 06:16:04.956154 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.016711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-ssh-key-openstack-edpm-ipam\") pod \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.016766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-inventory\") pod \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.016815 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7668p\" (UniqueName: \"kubernetes.io/projected/d7bc2edd-9db2-40df-be54-0db1c1b462fa-kube-api-access-7668p\") pod \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\" (UID: \"d7bc2edd-9db2-40df-be54-0db1c1b462fa\") " Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.045754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-inventory" (OuterVolumeSpecName: "inventory") pod "d7bc2edd-9db2-40df-be54-0db1c1b462fa" (UID: "d7bc2edd-9db2-40df-be54-0db1c1b462fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.047477 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bc2edd-9db2-40df-be54-0db1c1b462fa-kube-api-access-7668p" (OuterVolumeSpecName: "kube-api-access-7668p") pod "d7bc2edd-9db2-40df-be54-0db1c1b462fa" (UID: "d7bc2edd-9db2-40df-be54-0db1c1b462fa"). 
InnerVolumeSpecName "kube-api-access-7668p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.061221 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d7bc2edd-9db2-40df-be54-0db1c1b462fa" (UID: "d7bc2edd-9db2-40df-be54-0db1c1b462fa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.119901 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.119936 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7bc2edd-9db2-40df-be54-0db1c1b462fa-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.119947 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7668p\" (UniqueName: \"kubernetes.io/projected/d7bc2edd-9db2-40df-be54-0db1c1b462fa-kube-api-access-7668p\") on node \"crc\" DevicePath \"\"" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.478753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" event={"ID":"d7bc2edd-9db2-40df-be54-0db1c1b462fa","Type":"ContainerDied","Data":"d0a788f7b61936e4dd747f0ab3a41d619c45ee4f8d00a35e97279b573885f60b"} Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.479158 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0a788f7b61936e4dd747f0ab3a41d619c45ee4f8d00a35e97279b573885f60b" Feb 18 06:16:05 crc 
kubenswrapper[4707]: I0218 06:16:05.478893 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xtr45" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.566540 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm"] Feb 18 06:16:05 crc kubenswrapper[4707]: E0218 06:16:05.567299 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8019647-1774-483b-b11e-b478b894487f" containerName="collect-profiles" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.567328 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8019647-1774-483b-b11e-b478b894487f" containerName="collect-profiles" Feb 18 06:16:05 crc kubenswrapper[4707]: E0218 06:16:05.567370 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bc2edd-9db2-40df-be54-0db1c1b462fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.567378 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bc2edd-9db2-40df-be54-0db1c1b462fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.567556 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bc2edd-9db2-40df-be54-0db1c1b462fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.567582 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8019647-1774-483b-b11e-b478b894487f" containerName="collect-profiles" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.568352 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.576510 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm"] Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.579499 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.579689 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.579834 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.579950 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.632218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncjpb\" (UniqueName: \"kubernetes.io/projected/18d9274e-1766-4a10-9522-568030d5db64-kube-api-access-ncjpb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-557tm\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.632388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-557tm\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: 
I0218 06:16:05.632419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-557tm\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.733996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-557tm\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.734055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-557tm\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.734193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncjpb\" (UniqueName: \"kubernetes.io/projected/18d9274e-1766-4a10-9522-568030d5db64-kube-api-access-ncjpb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-557tm\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.738919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-557tm\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.745853 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-557tm\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.750902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncjpb\" (UniqueName: \"kubernetes.io/projected/18d9274e-1766-4a10-9522-568030d5db64-kube-api-access-ncjpb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-557tm\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:05 crc kubenswrapper[4707]: I0218 06:16:05.898924 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:16:06 crc kubenswrapper[4707]: I0218 06:16:06.408114 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm"] Feb 18 06:16:06 crc kubenswrapper[4707]: W0218 06:16:06.418003 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18d9274e_1766_4a10_9522_568030d5db64.slice/crio-63421e330768fb01948d7938e7fde93ec97944dd158e8e112c54345d74211ab0 WatchSource:0}: Error finding container 63421e330768fb01948d7938e7fde93ec97944dd158e8e112c54345d74211ab0: Status 404 returned error can't find the container with id 63421e330768fb01948d7938e7fde93ec97944dd158e8e112c54345d74211ab0 Feb 18 06:16:06 crc kubenswrapper[4707]: I0218 06:16:06.488041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" event={"ID":"18d9274e-1766-4a10-9522-568030d5db64","Type":"ContainerStarted","Data":"63421e330768fb01948d7938e7fde93ec97944dd158e8e112c54345d74211ab0"} Feb 18 06:16:07 crc kubenswrapper[4707]: I0218 06:16:07.496443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" event={"ID":"18d9274e-1766-4a10-9522-568030d5db64","Type":"ContainerStarted","Data":"64fa4885f35d2bcf866816c0e26f4b69ffba5d88b203c62536b3d6098e12e781"} Feb 18 06:16:07 crc kubenswrapper[4707]: I0218 06:16:07.520056 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" podStartSLOduration=2.033085708 podStartE2EDuration="2.520040108s" podCreationTimestamp="2026-02-18 06:16:05 +0000 UTC" firstStartedPulling="2026-02-18 06:16:06.419772411 +0000 UTC m=+1703.067731545" lastFinishedPulling="2026-02-18 06:16:06.906726811 +0000 
UTC m=+1703.554685945" observedRunningTime="2026-02-18 06:16:07.514717465 +0000 UTC m=+1704.162676599" watchObservedRunningTime="2026-02-18 06:16:07.520040108 +0000 UTC m=+1704.167999242" Feb 18 06:16:14 crc kubenswrapper[4707]: I0218 06:16:14.059128 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:16:14 crc kubenswrapper[4707]: E0218 06:16:14.059651 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:16:18 crc kubenswrapper[4707]: I0218 06:16:18.047766 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-t45dj"] Feb 18 06:16:18 crc kubenswrapper[4707]: I0218 06:16:18.069461 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-t45dj"] Feb 18 06:16:20 crc kubenswrapper[4707]: I0218 06:16:20.071889 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668b58a1-6a64-4356-a00f-ccf6faa1ce3b" path="/var/lib/kubelet/pods/668b58a1-6a64-4356-a00f-ccf6faa1ce3b/volumes" Feb 18 06:16:24 crc kubenswrapper[4707]: I0218 06:16:24.046109 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4gf8v"] Feb 18 06:16:24 crc kubenswrapper[4707]: I0218 06:16:24.064602 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4gf8v"] Feb 18 06:16:26 crc kubenswrapper[4707]: I0218 06:16:26.080855 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:16:26 crc kubenswrapper[4707]: E0218 06:16:26.081438 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:16:26 crc kubenswrapper[4707]: I0218 06:16:26.094634 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b6874e-b7bf-4d36-8ea1-66cf492ba327" path="/var/lib/kubelet/pods/27b6874e-b7bf-4d36-8ea1-66cf492ba327/volumes" Feb 18 06:16:27 crc kubenswrapper[4707]: I0218 06:16:27.030987 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5qczm"] Feb 18 06:16:27 crc kubenswrapper[4707]: I0218 06:16:27.039523 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5qczm"] Feb 18 06:16:28 crc kubenswrapper[4707]: I0218 06:16:28.068174 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2835aa4-2296-4af8-b099-af250380f599" path="/var/lib/kubelet/pods/a2835aa4-2296-4af8-b099-af250380f599/volumes" Feb 18 06:16:37 crc kubenswrapper[4707]: I0218 06:16:37.053285 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:16:37 crc kubenswrapper[4707]: E0218 06:16:37.054125 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:16:46 crc kubenswrapper[4707]: I0218 06:16:46.034961 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-bpqnj"] Feb 18 06:16:46 crc kubenswrapper[4707]: I0218 06:16:46.044459 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-bpqnj"] Feb 18 06:16:46 crc kubenswrapper[4707]: I0218 06:16:46.062561 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f8ab203-7caa-4df6-9c0c-599a9d1b9612" path="/var/lib/kubelet/pods/3f8ab203-7caa-4df6-9c0c-599a9d1b9612/volumes" Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.030161 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-4kl7b"] Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.038387 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-w2p4x"] Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.047761 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-4kl7b"] Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.053386 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:16:48 crc kubenswrapper[4707]: E0218 06:16:48.053913 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.064438 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f975a7-ca3d-4940-9281-051360f67955" path="/var/lib/kubelet/pods/e5f975a7-ca3d-4940-9281-051360f67955/volumes" Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.065239 4707 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/manila-db-sync-w2p4x"] Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.547072 4707 scope.go:117] "RemoveContainer" containerID="279f5535fb47a6f51575f5c80b1d90fab834c7b8ae4cc1cbc8af455923031057" Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.577576 4707 scope.go:117] "RemoveContainer" containerID="3a93156ee1b4f3e91620c4bde12236aa1e4ef2c92c4112e24d4e0a8a2e806da6" Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.640084 4707 scope.go:117] "RemoveContainer" containerID="1ebbb7fc9bc31795cd57981b0878f89911fcc659b97fd197bce36c9a99dac95d" Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.669557 4707 scope.go:117] "RemoveContainer" containerID="36085a9410c8c53d98b5847c5b00a965f89d979f3a53a99d89fba2d10f3760e8" Feb 18 06:16:48 crc kubenswrapper[4707]: I0218 06:16:48.713315 4707 scope.go:117] "RemoveContainer" containerID="5a7d6a439a0870e8147b64adeac25288771e67e2e5bbb4437da0e31d0cb51c02" Feb 18 06:16:50 crc kubenswrapper[4707]: I0218 06:16:50.062826 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0301370e-0d52-4549-93ba-033d6d706508" path="/var/lib/kubelet/pods/0301370e-0d52-4549-93ba-033d6d706508/volumes" Feb 18 06:17:00 crc kubenswrapper[4707]: I0218 06:17:00.053637 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:17:00 crc kubenswrapper[4707]: E0218 06:17:00.054540 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:17:15 crc kubenswrapper[4707]: I0218 06:17:15.053098 4707 scope.go:117] "RemoveContainer" 
containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:17:15 crc kubenswrapper[4707]: E0218 06:17:15.053879 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:17:16 crc kubenswrapper[4707]: I0218 06:17:16.529542 4707 generic.go:334] "Generic (PLEG): container finished" podID="18d9274e-1766-4a10-9522-568030d5db64" containerID="64fa4885f35d2bcf866816c0e26f4b69ffba5d88b203c62536b3d6098e12e781" exitCode=0 Feb 18 06:17:16 crc kubenswrapper[4707]: I0218 06:17:16.529752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" event={"ID":"18d9274e-1766-4a10-9522-568030d5db64","Type":"ContainerDied","Data":"64fa4885f35d2bcf866816c0e26f4b69ffba5d88b203c62536b3d6098e12e781"} Feb 18 06:17:17 crc kubenswrapper[4707]: I0218 06:17:17.962167 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.150974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncjpb\" (UniqueName: \"kubernetes.io/projected/18d9274e-1766-4a10-9522-568030d5db64-kube-api-access-ncjpb\") pod \"18d9274e-1766-4a10-9522-568030d5db64\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.151169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-ssh-key-openstack-edpm-ipam\") pod \"18d9274e-1766-4a10-9522-568030d5db64\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.151376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-inventory\") pod \"18d9274e-1766-4a10-9522-568030d5db64\" (UID: \"18d9274e-1766-4a10-9522-568030d5db64\") " Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.159122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d9274e-1766-4a10-9522-568030d5db64-kube-api-access-ncjpb" (OuterVolumeSpecName: "kube-api-access-ncjpb") pod "18d9274e-1766-4a10-9522-568030d5db64" (UID: "18d9274e-1766-4a10-9522-568030d5db64"). InnerVolumeSpecName "kube-api-access-ncjpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.180032 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "18d9274e-1766-4a10-9522-568030d5db64" (UID: "18d9274e-1766-4a10-9522-568030d5db64"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.180552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-inventory" (OuterVolumeSpecName: "inventory") pod "18d9274e-1766-4a10-9522-568030d5db64" (UID: "18d9274e-1766-4a10-9522-568030d5db64"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.254548 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.254586 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d9274e-1766-4a10-9522-568030d5db64-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.254597 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncjpb\" (UniqueName: \"kubernetes.io/projected/18d9274e-1766-4a10-9522-568030d5db64-kube-api-access-ncjpb\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.552900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" 
event={"ID":"18d9274e-1766-4a10-9522-568030d5db64","Type":"ContainerDied","Data":"63421e330768fb01948d7938e7fde93ec97944dd158e8e112c54345d74211ab0"} Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.552943 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63421e330768fb01948d7938e7fde93ec97944dd158e8e112c54345d74211ab0" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.552998 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-557tm" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.650026 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl"] Feb 18 06:17:18 crc kubenswrapper[4707]: E0218 06:17:18.650481 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d9274e-1766-4a10-9522-568030d5db64" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.650502 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d9274e-1766-4a10-9522-568030d5db64" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.650753 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d9274e-1766-4a10-9522-568030d5db64" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.651610 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.654558 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.654770 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.655731 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.655933 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.672258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl"] Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.764285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.764390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jmfs\" (UniqueName: \"kubernetes.io/projected/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-kube-api-access-7jmfs\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 
06:17:18.765388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.867034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.867453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jmfs\" (UniqueName: \"kubernetes.io/projected/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-kube-api-access-7jmfs\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.867611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.872570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.872906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.889569 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jmfs\" (UniqueName: \"kubernetes.io/projected/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-kube-api-access-7jmfs\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:18 crc kubenswrapper[4707]: I0218 06:17:18.972548 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:19 crc kubenswrapper[4707]: I0218 06:17:19.491486 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl"] Feb 18 06:17:19 crc kubenswrapper[4707]: I0218 06:17:19.499586 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:17:19 crc kubenswrapper[4707]: I0218 06:17:19.561717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" event={"ID":"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68","Type":"ContainerStarted","Data":"470fba1659fd1bc49ce76243edb8c06ff0d4abaf9c8df503f46aa14698b30ca9"} Feb 18 06:17:20 crc kubenswrapper[4707]: I0218 06:17:20.572738 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" event={"ID":"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68","Type":"ContainerStarted","Data":"6c97df0becf4beb13750d58e3752b42deb09564f3df8818a2a4968bacacc3490"} Feb 18 06:17:25 crc kubenswrapper[4707]: I0218 06:17:25.618782 4707 generic.go:334] "Generic (PLEG): container finished" podID="6cfdc829-6a01-4b1b-b774-5b7a0ff96d68" containerID="6c97df0becf4beb13750d58e3752b42deb09564f3df8818a2a4968bacacc3490" exitCode=0 Feb 18 06:17:25 crc kubenswrapper[4707]: I0218 06:17:25.618856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" event={"ID":"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68","Type":"ContainerDied","Data":"6c97df0becf4beb13750d58e3752b42deb09564f3df8818a2a4968bacacc3490"} Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.038983 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8582-account-create-update-d45wr"] Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.048319 4707 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8582-account-create-update-d45wr"] Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.074850 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.220966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jmfs\" (UniqueName: \"kubernetes.io/projected/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-kube-api-access-7jmfs\") pod \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.221032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-ssh-key-openstack-edpm-ipam\") pod \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.221096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-inventory\") pod \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\" (UID: \"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68\") " Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.227312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-kube-api-access-7jmfs" (OuterVolumeSpecName: "kube-api-access-7jmfs") pod "6cfdc829-6a01-4b1b-b774-5b7a0ff96d68" (UID: "6cfdc829-6a01-4b1b-b774-5b7a0ff96d68"). InnerVolumeSpecName "kube-api-access-7jmfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.252313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6cfdc829-6a01-4b1b-b774-5b7a0ff96d68" (UID: "6cfdc829-6a01-4b1b-b774-5b7a0ff96d68"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.254052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-inventory" (OuterVolumeSpecName: "inventory") pod "6cfdc829-6a01-4b1b-b774-5b7a0ff96d68" (UID: "6cfdc829-6a01-4b1b-b774-5b7a0ff96d68"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.323248 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jmfs\" (UniqueName: \"kubernetes.io/projected/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-kube-api-access-7jmfs\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.323276 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.323286 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cfdc829-6a01-4b1b-b774-5b7a0ff96d68-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.637214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" 
event={"ID":"6cfdc829-6a01-4b1b-b774-5b7a0ff96d68","Type":"ContainerDied","Data":"470fba1659fd1bc49ce76243edb8c06ff0d4abaf9c8df503f46aa14698b30ca9"} Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.637258 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="470fba1659fd1bc49ce76243edb8c06ff0d4abaf9c8df503f46aa14698b30ca9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.637289 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.723759 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9"] Feb 18 06:17:27 crc kubenswrapper[4707]: E0218 06:17:27.724250 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfdc829-6a01-4b1b-b774-5b7a0ff96d68" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.724269 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfdc829-6a01-4b1b-b774-5b7a0ff96d68" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.724460 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfdc829-6a01-4b1b-b774-5b7a0ff96d68" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.725124 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.726825 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.726956 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.727379 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.733937 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9"] Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.751910 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.832074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfm89\" (UniqueName: \"kubernetes.io/projected/e3939063-5ede-47de-8c02-a46756c148b5-kube-api-access-hfm89\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-45gf9\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.832945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-45gf9\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.833164 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-45gf9\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.934165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-45gf9\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.934268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfm89\" (UniqueName: \"kubernetes.io/projected/e3939063-5ede-47de-8c02-a46756c148b5-kube-api-access-hfm89\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-45gf9\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.934360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-45gf9\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.940231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-45gf9\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.943748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-45gf9\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:27 crc kubenswrapper[4707]: I0218 06:17:27.952490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfm89\" (UniqueName: \"kubernetes.io/projected/e3939063-5ede-47de-8c02-a46756c148b5-kube-api-access-hfm89\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-45gf9\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.040553 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f78d-account-create-update-tw92l"] Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.069764 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.077051 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.090704 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868cb81a-07d6-4515-8129-32c1e5d06ca7" path="/var/lib/kubelet/pods/868cb81a-07d6-4515-8129-32c1e5d06ca7/volumes" Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.091869 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ts677"] Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.094729 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-f5brp"] Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.103561 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f78d-account-create-update-tw92l"] Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.112343 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-f5brp"] Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.120008 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ts677"] Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.598552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9"] Feb 18 06:17:28 crc kubenswrapper[4707]: W0218 06:17:28.598630 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3939063_5ede_47de_8c02_a46756c148b5.slice/crio-d55f3d2755613fdd3049d835a7d08f5284b19ff89bf163c8f23eecf3e7e3c873 WatchSource:0}: Error finding container d55f3d2755613fdd3049d835a7d08f5284b19ff89bf163c8f23eecf3e7e3c873: Status 404 returned error can't find the container 
with id d55f3d2755613fdd3049d835a7d08f5284b19ff89bf163c8f23eecf3e7e3c873 Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.649086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"b793e97ac0e534321a4fdc530604143a86bad11d81b78baa1a5c35dfbdc0cbf8"} Feb 18 06:17:28 crc kubenswrapper[4707]: I0218 06:17:28.651820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" event={"ID":"e3939063-5ede-47de-8c02-a46756c148b5","Type":"ContainerStarted","Data":"d55f3d2755613fdd3049d835a7d08f5284b19ff89bf163c8f23eecf3e7e3c873"} Feb 18 06:17:29 crc kubenswrapper[4707]: I0218 06:17:29.049971 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-r5xft"] Feb 18 06:17:29 crc kubenswrapper[4707]: I0218 06:17:29.065428 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-r5xft"] Feb 18 06:17:29 crc kubenswrapper[4707]: I0218 06:17:29.076810 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f40b-account-create-update-55rmw"] Feb 18 06:17:29 crc kubenswrapper[4707]: I0218 06:17:29.085570 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f40b-account-create-update-55rmw"] Feb 18 06:17:29 crc kubenswrapper[4707]: I0218 06:17:29.661120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" event={"ID":"e3939063-5ede-47de-8c02-a46756c148b5","Type":"ContainerStarted","Data":"cf081a59adc735effa72ac40ea93dcbd675c735f798a7fa282db9f181579a2ea"} Feb 18 06:17:29 crc kubenswrapper[4707]: I0218 06:17:29.691379 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" 
podStartSLOduration=2.300787385 podStartE2EDuration="2.691356907s" podCreationTimestamp="2026-02-18 06:17:27 +0000 UTC" firstStartedPulling="2026-02-18 06:17:28.601135978 +0000 UTC m=+1785.249095112" lastFinishedPulling="2026-02-18 06:17:28.9917055 +0000 UTC m=+1785.639664634" observedRunningTime="2026-02-18 06:17:29.683397452 +0000 UTC m=+1786.331356586" watchObservedRunningTime="2026-02-18 06:17:29.691356907 +0000 UTC m=+1786.339316041" Feb 18 06:17:30 crc kubenswrapper[4707]: I0218 06:17:30.074376 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c750576-ce13-4f43-a430-80a8d84b7829" path="/var/lib/kubelet/pods/3c750576-ce13-4f43-a430-80a8d84b7829/volumes" Feb 18 06:17:30 crc kubenswrapper[4707]: I0218 06:17:30.075504 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e387c5-5644-44e5-9479-901efc3f88e8" path="/var/lib/kubelet/pods/57e387c5-5644-44e5-9479-901efc3f88e8/volumes" Feb 18 06:17:30 crc kubenswrapper[4707]: I0218 06:17:30.076726 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9ed69c-f65c-4a44-bf84-f9f911905bce" path="/var/lib/kubelet/pods/8b9ed69c-f65c-4a44-bf84-f9f911905bce/volumes" Feb 18 06:17:30 crc kubenswrapper[4707]: I0218 06:17:30.079966 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b279b0e7-ca70-4cfd-92a9-c90f10658f69" path="/var/lib/kubelet/pods/b279b0e7-ca70-4cfd-92a9-c90f10658f69/volumes" Feb 18 06:17:30 crc kubenswrapper[4707]: I0218 06:17:30.081243 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd09444c-e4f0-4a3b-afeb-6841e197b017" path="/var/lib/kubelet/pods/dd09444c-e4f0-4a3b-afeb-6841e197b017/volumes" Feb 18 06:17:48 crc kubenswrapper[4707]: I0218 06:17:48.875427 4707 scope.go:117] "RemoveContainer" containerID="6a4e495c468efb5e133c7e8317d4ae4053e8d18a7c9e204f0a081d8d8e68e244" Feb 18 06:17:48 crc kubenswrapper[4707]: I0218 06:17:48.897591 4707 scope.go:117] "RemoveContainer" 
containerID="b58071643fc0e14e0134e48024690ff550ef5996252cb9bc0688b929107e83ea" Feb 18 06:17:48 crc kubenswrapper[4707]: I0218 06:17:48.962278 4707 scope.go:117] "RemoveContainer" containerID="add97642b7ba0a4cf22007e71c4dc74213e686f33d7b6829ec86917b16faab4a" Feb 18 06:17:49 crc kubenswrapper[4707]: I0218 06:17:49.000761 4707 scope.go:117] "RemoveContainer" containerID="41d8068787160a33336b10393dce5cd5dbb2fb1f382dfb8518fa0426ef1569e8" Feb 18 06:17:49 crc kubenswrapper[4707]: I0218 06:17:49.051213 4707 scope.go:117] "RemoveContainer" containerID="683c1f6417d107a510ab78429d2e1fb2aade6e86dc4a7ef8a7efc21f5a865e0c" Feb 18 06:17:49 crc kubenswrapper[4707]: I0218 06:17:49.081436 4707 scope.go:117] "RemoveContainer" containerID="2100f4326fde2edef8d66ac7a2809ddcb820e8c0e95af02e9cdae51b0602aa65" Feb 18 06:17:49 crc kubenswrapper[4707]: I0218 06:17:49.133466 4707 scope.go:117] "RemoveContainer" containerID="3d4269a3dd801c28cf9dceeee0566fb80237c282e501ec50aa6498dd38d90b4b" Feb 18 06:18:01 crc kubenswrapper[4707]: I0218 06:18:01.048014 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77pfl"] Feb 18 06:18:01 crc kubenswrapper[4707]: I0218 06:18:01.061444 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77pfl"] Feb 18 06:18:02 crc kubenswrapper[4707]: I0218 06:18:02.063956 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef28baea-267b-4872-bc1d-036169e0a9f2" path="/var/lib/kubelet/pods/ef28baea-267b-4872-bc1d-036169e0a9f2/volumes" Feb 18 06:18:06 crc kubenswrapper[4707]: I0218 06:18:06.028319 4707 generic.go:334] "Generic (PLEG): container finished" podID="e3939063-5ede-47de-8c02-a46756c148b5" containerID="cf081a59adc735effa72ac40ea93dcbd675c735f798a7fa282db9f181579a2ea" exitCode=0 Feb 18 06:18:06 crc kubenswrapper[4707]: I0218 06:18:06.028454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" event={"ID":"e3939063-5ede-47de-8c02-a46756c148b5","Type":"ContainerDied","Data":"cf081a59adc735effa72ac40ea93dcbd675c735f798a7fa282db9f181579a2ea"} Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.419045 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.476294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfm89\" (UniqueName: \"kubernetes.io/projected/e3939063-5ede-47de-8c02-a46756c148b5-kube-api-access-hfm89\") pod \"e3939063-5ede-47de-8c02-a46756c148b5\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.476417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-ssh-key-openstack-edpm-ipam\") pod \"e3939063-5ede-47de-8c02-a46756c148b5\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.476624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-inventory\") pod \"e3939063-5ede-47de-8c02-a46756c148b5\" (UID: \"e3939063-5ede-47de-8c02-a46756c148b5\") " Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.482639 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3939063-5ede-47de-8c02-a46756c148b5-kube-api-access-hfm89" (OuterVolumeSpecName: "kube-api-access-hfm89") pod "e3939063-5ede-47de-8c02-a46756c148b5" (UID: "e3939063-5ede-47de-8c02-a46756c148b5"). InnerVolumeSpecName "kube-api-access-hfm89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.502907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e3939063-5ede-47de-8c02-a46756c148b5" (UID: "e3939063-5ede-47de-8c02-a46756c148b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.504155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-inventory" (OuterVolumeSpecName: "inventory") pod "e3939063-5ede-47de-8c02-a46756c148b5" (UID: "e3939063-5ede-47de-8c02-a46756c148b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.578036 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.578071 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfm89\" (UniqueName: \"kubernetes.io/projected/e3939063-5ede-47de-8c02-a46756c148b5-kube-api-access-hfm89\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:07 crc kubenswrapper[4707]: I0218 06:18:07.578083 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3939063-5ede-47de-8c02-a46756c148b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.047326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" 
event={"ID":"e3939063-5ede-47de-8c02-a46756c148b5","Type":"ContainerDied","Data":"d55f3d2755613fdd3049d835a7d08f5284b19ff89bf163c8f23eecf3e7e3c873"} Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.047373 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d55f3d2755613fdd3049d835a7d08f5284b19ff89bf163c8f23eecf3e7e3c873" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.047450 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-45gf9" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.159228 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs"] Feb 18 06:18:08 crc kubenswrapper[4707]: E0218 06:18:08.159970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3939063-5ede-47de-8c02-a46756c148b5" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.159995 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3939063-5ede-47de-8c02-a46756c148b5" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.160253 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3939063-5ede-47de-8c02-a46756c148b5" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.160879 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.162940 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.163140 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.163874 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.164035 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.178098 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs"] Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.290329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlvs\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.290398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlvs\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.290433 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4qrp\" (UniqueName: \"kubernetes.io/projected/beb9134a-dfca-4e8d-be56-0e0980d32bc8-kube-api-access-c4qrp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlvs\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.392405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlvs\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.392485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlvs\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.392563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4qrp\" (UniqueName: \"kubernetes.io/projected/beb9134a-dfca-4e8d-be56-0e0980d32bc8-kube-api-access-c4qrp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlvs\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.398205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlvs\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.398501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlvs\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.411864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4qrp\" (UniqueName: \"kubernetes.io/projected/beb9134a-dfca-4e8d-be56-0e0980d32bc8-kube-api-access-c4qrp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlvs\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:08 crc kubenswrapper[4707]: I0218 06:18:08.482101 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:18:09 crc kubenswrapper[4707]: I0218 06:18:09.061404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs"] Feb 18 06:18:09 crc kubenswrapper[4707]: W0218 06:18:09.067747 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbeb9134a_dfca_4e8d_be56_0e0980d32bc8.slice/crio-5b2d5c55c9a4fb6755330c3c633bff0a4021f2ccea735b72b9821e89d9ac198b WatchSource:0}: Error finding container 5b2d5c55c9a4fb6755330c3c633bff0a4021f2ccea735b72b9821e89d9ac198b: Status 404 returned error can't find the container with id 5b2d5c55c9a4fb6755330c3c633bff0a4021f2ccea735b72b9821e89d9ac198b Feb 18 06:18:10 crc kubenswrapper[4707]: I0218 06:18:10.066948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" event={"ID":"beb9134a-dfca-4e8d-be56-0e0980d32bc8","Type":"ContainerStarted","Data":"1e037f57978dccbd9ef524ca208cd727121853d4ba6fc97351a42e9907b418a6"} Feb 18 06:18:10 crc kubenswrapper[4707]: I0218 06:18:10.067581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" event={"ID":"beb9134a-dfca-4e8d-be56-0e0980d32bc8","Type":"ContainerStarted","Data":"5b2d5c55c9a4fb6755330c3c633bff0a4021f2ccea735b72b9821e89d9ac198b"} Feb 18 06:18:10 crc kubenswrapper[4707]: I0218 06:18:10.084049 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" podStartSLOduration=1.6985453640000001 podStartE2EDuration="2.084031999s" podCreationTimestamp="2026-02-18 06:18:08 +0000 UTC" firstStartedPulling="2026-02-18 06:18:09.070369405 +0000 UTC m=+1825.718328539" lastFinishedPulling="2026-02-18 06:18:09.45585604 +0000 UTC 
m=+1826.103815174" observedRunningTime="2026-02-18 06:18:10.083707441 +0000 UTC m=+1826.731666595" watchObservedRunningTime="2026-02-18 06:18:10.084031999 +0000 UTC m=+1826.731991133" Feb 18 06:18:49 crc kubenswrapper[4707]: I0218 06:18:49.257373 4707 scope.go:117] "RemoveContainer" containerID="0b4f1333566d26b8362f423ce93b5519a653c4f378e39637bb3fd75d9247ad78" Feb 18 06:18:58 crc kubenswrapper[4707]: I0218 06:18:58.069379 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nnl9x"] Feb 18 06:18:58 crc kubenswrapper[4707]: I0218 06:18:58.071484 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gbm4m"] Feb 18 06:18:58 crc kubenswrapper[4707]: I0218 06:18:58.081877 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gbm4m"] Feb 18 06:18:58 crc kubenswrapper[4707]: I0218 06:18:58.090761 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nnl9x"] Feb 18 06:18:59 crc kubenswrapper[4707]: I0218 06:18:59.554483 4707 generic.go:334] "Generic (PLEG): container finished" podID="beb9134a-dfca-4e8d-be56-0e0980d32bc8" containerID="1e037f57978dccbd9ef524ca208cd727121853d4ba6fc97351a42e9907b418a6" exitCode=0 Feb 18 06:18:59 crc kubenswrapper[4707]: I0218 06:18:59.554590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" event={"ID":"beb9134a-dfca-4e8d-be56-0e0980d32bc8","Type":"ContainerDied","Data":"1e037f57978dccbd9ef524ca208cd727121853d4ba6fc97351a42e9907b418a6"} Feb 18 06:19:00 crc kubenswrapper[4707]: I0218 06:19:00.064363 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693f7b53-7a5c-42a3-b5e7-9fc96d46b78a" path="/var/lib/kubelet/pods/693f7b53-7a5c-42a3-b5e7-9fc96d46b78a/volumes" Feb 18 06:19:00 crc kubenswrapper[4707]: I0218 06:19:00.065002 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="7ac58976-c8d4-406d-abf2-055b212106d1" path="/var/lib/kubelet/pods/7ac58976-c8d4-406d-abf2-055b212106d1/volumes" Feb 18 06:19:00 crc kubenswrapper[4707]: I0218 06:19:00.976224 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.080681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4qrp\" (UniqueName: \"kubernetes.io/projected/beb9134a-dfca-4e8d-be56-0e0980d32bc8-kube-api-access-c4qrp\") pod \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.081046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-inventory\") pod \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.081325 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-ssh-key-openstack-edpm-ipam\") pod \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\" (UID: \"beb9134a-dfca-4e8d-be56-0e0980d32bc8\") " Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.090656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb9134a-dfca-4e8d-be56-0e0980d32bc8-kube-api-access-c4qrp" (OuterVolumeSpecName: "kube-api-access-c4qrp") pod "beb9134a-dfca-4e8d-be56-0e0980d32bc8" (UID: "beb9134a-dfca-4e8d-be56-0e0980d32bc8"). InnerVolumeSpecName "kube-api-access-c4qrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.114414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-inventory" (OuterVolumeSpecName: "inventory") pod "beb9134a-dfca-4e8d-be56-0e0980d32bc8" (UID: "beb9134a-dfca-4e8d-be56-0e0980d32bc8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.117282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "beb9134a-dfca-4e8d-be56-0e0980d32bc8" (UID: "beb9134a-dfca-4e8d-be56-0e0980d32bc8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.184766 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.184956 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4qrp\" (UniqueName: \"kubernetes.io/projected/beb9134a-dfca-4e8d-be56-0e0980d32bc8-kube-api-access-c4qrp\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.185022 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb9134a-dfca-4e8d-be56-0e0980d32bc8-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.572986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" 
event={"ID":"beb9134a-dfca-4e8d-be56-0e0980d32bc8","Type":"ContainerDied","Data":"5b2d5c55c9a4fb6755330c3c633bff0a4021f2ccea735b72b9821e89d9ac198b"} Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.573332 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b2d5c55c9a4fb6755330c3c633bff0a4021f2ccea735b72b9821e89d9ac198b" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.573015 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlvs" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.660923 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5rj9p"] Feb 18 06:19:01 crc kubenswrapper[4707]: E0218 06:19:01.661309 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb9134a-dfca-4e8d-be56-0e0980d32bc8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.661326 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb9134a-dfca-4e8d-be56-0e0980d32bc8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.661531 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="beb9134a-dfca-4e8d-be56-0e0980d32bc8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.662121 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.664960 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.665016 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.665154 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.665628 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.674675 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5rj9p"] Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.797217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5rj9p\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.797272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5rj9p\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.797313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8nj96\" (UniqueName: \"kubernetes.io/projected/eda30b1a-96f0-425e-908d-4846ffe8c3bb-kube-api-access-8nj96\") pod \"ssh-known-hosts-edpm-deployment-5rj9p\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.899255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5rj9p\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.899312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5rj9p\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.899346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nj96\" (UniqueName: \"kubernetes.io/projected/eda30b1a-96f0-425e-908d-4846ffe8c3bb-kube-api-access-8nj96\") pod \"ssh-known-hosts-edpm-deployment-5rj9p\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.911441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5rj9p\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.911680 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5rj9p\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.918163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nj96\" (UniqueName: \"kubernetes.io/projected/eda30b1a-96f0-425e-908d-4846ffe8c3bb-kube-api-access-8nj96\") pod \"ssh-known-hosts-edpm-deployment-5rj9p\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:01 crc kubenswrapper[4707]: I0218 06:19:01.977856 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:02 crc kubenswrapper[4707]: I0218 06:19:02.473191 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5rj9p"] Feb 18 06:19:02 crc kubenswrapper[4707]: I0218 06:19:02.582255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" event={"ID":"eda30b1a-96f0-425e-908d-4846ffe8c3bb","Type":"ContainerStarted","Data":"04aaf22c231ff1394897fbec7b8fc55103323bbe361b8935a1bc35806ac6987c"} Feb 18 06:19:03 crc kubenswrapper[4707]: I0218 06:19:03.593040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" event={"ID":"eda30b1a-96f0-425e-908d-4846ffe8c3bb","Type":"ContainerStarted","Data":"b3e8099c091776ca4d7a784849dfa967b80c94c8e65c7869c4d57b9179944685"} Feb 18 06:19:03 crc kubenswrapper[4707]: I0218 06:19:03.610519 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" 
podStartSLOduration=2.164026531 podStartE2EDuration="2.610497669s" podCreationTimestamp="2026-02-18 06:19:01 +0000 UTC" firstStartedPulling="2026-02-18 06:19:02.483706176 +0000 UTC m=+1879.131665310" lastFinishedPulling="2026-02-18 06:19:02.930177314 +0000 UTC m=+1879.578136448" observedRunningTime="2026-02-18 06:19:03.607288393 +0000 UTC m=+1880.255247527" watchObservedRunningTime="2026-02-18 06:19:03.610497669 +0000 UTC m=+1880.258456823" Feb 18 06:19:10 crc kubenswrapper[4707]: I0218 06:19:10.648701 4707 generic.go:334] "Generic (PLEG): container finished" podID="eda30b1a-96f0-425e-908d-4846ffe8c3bb" containerID="b3e8099c091776ca4d7a784849dfa967b80c94c8e65c7869c4d57b9179944685" exitCode=0 Feb 18 06:19:10 crc kubenswrapper[4707]: I0218 06:19:10.648827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" event={"ID":"eda30b1a-96f0-425e-908d-4846ffe8c3bb","Type":"ContainerDied","Data":"b3e8099c091776ca4d7a784849dfa967b80c94c8e65c7869c4d57b9179944685"} Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.051809 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.209116 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-inventory-0\") pod \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.209215 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-ssh-key-openstack-edpm-ipam\") pod \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.209367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nj96\" (UniqueName: \"kubernetes.io/projected/eda30b1a-96f0-425e-908d-4846ffe8c3bb-kube-api-access-8nj96\") pod \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\" (UID: \"eda30b1a-96f0-425e-908d-4846ffe8c3bb\") " Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.219021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda30b1a-96f0-425e-908d-4846ffe8c3bb-kube-api-access-8nj96" (OuterVolumeSpecName: "kube-api-access-8nj96") pod "eda30b1a-96f0-425e-908d-4846ffe8c3bb" (UID: "eda30b1a-96f0-425e-908d-4846ffe8c3bb"). InnerVolumeSpecName "kube-api-access-8nj96". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.237358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "eda30b1a-96f0-425e-908d-4846ffe8c3bb" (UID: "eda30b1a-96f0-425e-908d-4846ffe8c3bb"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.239430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eda30b1a-96f0-425e-908d-4846ffe8c3bb" (UID: "eda30b1a-96f0-425e-908d-4846ffe8c3bb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.311615 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nj96\" (UniqueName: \"kubernetes.io/projected/eda30b1a-96f0-425e-908d-4846ffe8c3bb-kube-api-access-8nj96\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.311659 4707 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.311679 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda30b1a-96f0-425e-908d-4846ffe8c3bb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.669458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" event={"ID":"eda30b1a-96f0-425e-908d-4846ffe8c3bb","Type":"ContainerDied","Data":"04aaf22c231ff1394897fbec7b8fc55103323bbe361b8935a1bc35806ac6987c"} Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.669880 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04aaf22c231ff1394897fbec7b8fc55103323bbe361b8935a1bc35806ac6987c" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.669526 
4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rj9p" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.767905 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz"] Feb 18 06:19:12 crc kubenswrapper[4707]: E0218 06:19:12.768445 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda30b1a-96f0-425e-908d-4846ffe8c3bb" containerName="ssh-known-hosts-edpm-deployment" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.768462 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda30b1a-96f0-425e-908d-4846ffe8c3bb" containerName="ssh-known-hosts-edpm-deployment" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.768725 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda30b1a-96f0-425e-908d-4846ffe8c3bb" containerName="ssh-known-hosts-edpm-deployment" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.769762 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.772736 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.773043 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.773334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.773632 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.776790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz"] Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.924986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w7kvz\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.925402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w7kvz\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:12 crc kubenswrapper[4707]: I0218 06:19:12.925525 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7xx4\" (UniqueName: \"kubernetes.io/projected/72503a5f-0b97-4eee-b0d1-7f9621b6917c-kube-api-access-w7xx4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w7kvz\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:13 crc kubenswrapper[4707]: I0218 06:19:13.027213 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w7kvz\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:13 crc kubenswrapper[4707]: I0218 06:19:13.027531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7xx4\" (UniqueName: \"kubernetes.io/projected/72503a5f-0b97-4eee-b0d1-7f9621b6917c-kube-api-access-w7xx4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w7kvz\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:13 crc kubenswrapper[4707]: I0218 06:19:13.027689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w7kvz\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:13 crc kubenswrapper[4707]: I0218 06:19:13.033465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w7kvz\" (UID: 
\"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:13 crc kubenswrapper[4707]: I0218 06:19:13.034488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w7kvz\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:13 crc kubenswrapper[4707]: I0218 06:19:13.048035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7xx4\" (UniqueName: \"kubernetes.io/projected/72503a5f-0b97-4eee-b0d1-7f9621b6917c-kube-api-access-w7xx4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w7kvz\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:13 crc kubenswrapper[4707]: I0218 06:19:13.104722 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:13 crc kubenswrapper[4707]: I0218 06:19:13.649947 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz"] Feb 18 06:19:13 crc kubenswrapper[4707]: I0218 06:19:13.679540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" event={"ID":"72503a5f-0b97-4eee-b0d1-7f9621b6917c","Type":"ContainerStarted","Data":"72f2b98194bfdd01622a667ff591be5b20087d9e8da50a623d6f329751e2efe4"} Feb 18 06:19:14 crc kubenswrapper[4707]: I0218 06:19:14.692306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" event={"ID":"72503a5f-0b97-4eee-b0d1-7f9621b6917c","Type":"ContainerStarted","Data":"81d0cea807ea1de769e6c994fdc69f5c54b8f67cf9e031412413cee29824aa1a"} Feb 18 06:19:14 crc kubenswrapper[4707]: I0218 06:19:14.712701 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" podStartSLOduration=2.22824847 podStartE2EDuration="2.712648539s" podCreationTimestamp="2026-02-18 06:19:12 +0000 UTC" firstStartedPulling="2026-02-18 06:19:13.657265063 +0000 UTC m=+1890.305224197" lastFinishedPulling="2026-02-18 06:19:14.141665132 +0000 UTC m=+1890.789624266" observedRunningTime="2026-02-18 06:19:14.711400555 +0000 UTC m=+1891.359359679" watchObservedRunningTime="2026-02-18 06:19:14.712648539 +0000 UTC m=+1891.360607713" Feb 18 06:19:22 crc kubenswrapper[4707]: I0218 06:19:22.758019 4707 generic.go:334] "Generic (PLEG): container finished" podID="72503a5f-0b97-4eee-b0d1-7f9621b6917c" containerID="81d0cea807ea1de769e6c994fdc69f5c54b8f67cf9e031412413cee29824aa1a" exitCode=0 Feb 18 06:19:22 crc kubenswrapper[4707]: I0218 06:19:22.758228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" event={"ID":"72503a5f-0b97-4eee-b0d1-7f9621b6917c","Type":"ContainerDied","Data":"81d0cea807ea1de769e6c994fdc69f5c54b8f67cf9e031412413cee29824aa1a"} Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.179443 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.301212 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-ssh-key-openstack-edpm-ipam\") pod \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.301641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7xx4\" (UniqueName: \"kubernetes.io/projected/72503a5f-0b97-4eee-b0d1-7f9621b6917c-kube-api-access-w7xx4\") pod \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.301670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-inventory\") pod \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\" (UID: \"72503a5f-0b97-4eee-b0d1-7f9621b6917c\") " Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.313997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72503a5f-0b97-4eee-b0d1-7f9621b6917c-kube-api-access-w7xx4" (OuterVolumeSpecName: "kube-api-access-w7xx4") pod "72503a5f-0b97-4eee-b0d1-7f9621b6917c" (UID: "72503a5f-0b97-4eee-b0d1-7f9621b6917c"). InnerVolumeSpecName "kube-api-access-w7xx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.336055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-inventory" (OuterVolumeSpecName: "inventory") pod "72503a5f-0b97-4eee-b0d1-7f9621b6917c" (UID: "72503a5f-0b97-4eee-b0d1-7f9621b6917c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.345449 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "72503a5f-0b97-4eee-b0d1-7f9621b6917c" (UID: "72503a5f-0b97-4eee-b0d1-7f9621b6917c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.404714 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7xx4\" (UniqueName: \"kubernetes.io/projected/72503a5f-0b97-4eee-b0d1-7f9621b6917c-kube-api-access-w7xx4\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.404746 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.404756 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72503a5f-0b97-4eee-b0d1-7f9621b6917c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.779895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" 
event={"ID":"72503a5f-0b97-4eee-b0d1-7f9621b6917c","Type":"ContainerDied","Data":"72f2b98194bfdd01622a667ff591be5b20087d9e8da50a623d6f329751e2efe4"} Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.779959 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f2b98194bfdd01622a667ff591be5b20087d9e8da50a623d6f329751e2efe4" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.780037 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w7kvz" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.865159 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8"] Feb 18 06:19:24 crc kubenswrapper[4707]: E0218 06:19:24.865725 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72503a5f-0b97-4eee-b0d1-7f9621b6917c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.865743 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="72503a5f-0b97-4eee-b0d1-7f9621b6917c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.865938 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="72503a5f-0b97-4eee-b0d1-7f9621b6917c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.866506 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.868563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.868720 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.868789 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.872013 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:19:24 crc kubenswrapper[4707]: I0218 06:19:24.881455 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8"] Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.016586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.016825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.016872 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grtwx\" (UniqueName: \"kubernetes.io/projected/e81fc37d-6fb1-4a43-b632-cec42f602002-kube-api-access-grtwx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.118058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.118613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grtwx\" (UniqueName: \"kubernetes.io/projected/e81fc37d-6fb1-4a43-b632-cec42f602002-kube-api-access-grtwx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.118786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.124845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.124854 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.136121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grtwx\" (UniqueName: \"kubernetes.io/projected/e81fc37d-6fb1-4a43-b632-cec42f602002-kube-api-access-grtwx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.184885 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.677062 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8"] Feb 18 06:19:25 crc kubenswrapper[4707]: I0218 06:19:25.789485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" event={"ID":"e81fc37d-6fb1-4a43-b632-cec42f602002","Type":"ContainerStarted","Data":"1dd957384a30c5b7f794b3b38b9628c1daa8c3eaddeeabf62ee888eda8e37154"} Feb 18 06:19:26 crc kubenswrapper[4707]: I0218 06:19:26.798218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" event={"ID":"e81fc37d-6fb1-4a43-b632-cec42f602002","Type":"ContainerStarted","Data":"2d8a5ca1bcc89d676eea2571523542de2316330f0d7addeba32e9395f74bb309"} Feb 18 06:19:26 crc kubenswrapper[4707]: I0218 06:19:26.823568 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" podStartSLOduration=2.29235963 podStartE2EDuration="2.823543032s" podCreationTimestamp="2026-02-18 06:19:24 +0000 UTC" firstStartedPulling="2026-02-18 06:19:25.680920951 +0000 UTC m=+1902.328880085" lastFinishedPulling="2026-02-18 06:19:26.212104353 +0000 UTC m=+1902.860063487" observedRunningTime="2026-02-18 06:19:26.814515648 +0000 UTC m=+1903.462474792" watchObservedRunningTime="2026-02-18 06:19:26.823543032 +0000 UTC m=+1903.471502166" Feb 18 06:19:36 crc kubenswrapper[4707]: I0218 06:19:36.884479 4707 generic.go:334] "Generic (PLEG): container finished" podID="e81fc37d-6fb1-4a43-b632-cec42f602002" containerID="2d8a5ca1bcc89d676eea2571523542de2316330f0d7addeba32e9395f74bb309" exitCode=0 Feb 18 06:19:36 crc kubenswrapper[4707]: I0218 06:19:36.884577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" event={"ID":"e81fc37d-6fb1-4a43-b632-cec42f602002","Type":"ContainerDied","Data":"2d8a5ca1bcc89d676eea2571523542de2316330f0d7addeba32e9395f74bb309"} Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.361241 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.478249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grtwx\" (UniqueName: \"kubernetes.io/projected/e81fc37d-6fb1-4a43-b632-cec42f602002-kube-api-access-grtwx\") pod \"e81fc37d-6fb1-4a43-b632-cec42f602002\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.479294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-ssh-key-openstack-edpm-ipam\") pod \"e81fc37d-6fb1-4a43-b632-cec42f602002\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.479361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-inventory\") pod \"e81fc37d-6fb1-4a43-b632-cec42f602002\" (UID: \"e81fc37d-6fb1-4a43-b632-cec42f602002\") " Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.484593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81fc37d-6fb1-4a43-b632-cec42f602002-kube-api-access-grtwx" (OuterVolumeSpecName: "kube-api-access-grtwx") pod "e81fc37d-6fb1-4a43-b632-cec42f602002" (UID: "e81fc37d-6fb1-4a43-b632-cec42f602002"). InnerVolumeSpecName "kube-api-access-grtwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.506163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-inventory" (OuterVolumeSpecName: "inventory") pod "e81fc37d-6fb1-4a43-b632-cec42f602002" (UID: "e81fc37d-6fb1-4a43-b632-cec42f602002"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.514165 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e81fc37d-6fb1-4a43-b632-cec42f602002" (UID: "e81fc37d-6fb1-4a43-b632-cec42f602002"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.583698 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grtwx\" (UniqueName: \"kubernetes.io/projected/e81fc37d-6fb1-4a43-b632-cec42f602002-kube-api-access-grtwx\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.583769 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.583826 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81fc37d-6fb1-4a43-b632-cec42f602002-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.920612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" 
event={"ID":"e81fc37d-6fb1-4a43-b632-cec42f602002","Type":"ContainerDied","Data":"1dd957384a30c5b7f794b3b38b9628c1daa8c3eaddeeabf62ee888eda8e37154"} Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.920692 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dd957384a30c5b7f794b3b38b9628c1daa8c3eaddeeabf62ee888eda8e37154" Feb 18 06:19:38 crc kubenswrapper[4707]: I0218 06:19:38.921442 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.077041 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6"] Feb 18 06:19:39 crc kubenswrapper[4707]: E0218 06:19:39.077844 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81fc37d-6fb1-4a43-b632-cec42f602002" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.077866 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81fc37d-6fb1-4a43-b632-cec42f602002" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.078154 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81fc37d-6fb1-4a43-b632-cec42f602002" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.078999 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.084372 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.084663 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.084976 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.085029 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.085146 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.085361 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.085461 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.096053 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.105433 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6"] Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197236 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5hd8\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-kube-api-access-f5hd8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.197712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.198629 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302535 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5hd8\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-kube-api-access-f5hd8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.302793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.309522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc 
kubenswrapper[4707]: I0218 06:19:39.309949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.310864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.311675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.311843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.312401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.313418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.314834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.315567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.315891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.319897 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.320087 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.320473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.323030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5hd8\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-kube-api-access-f5hd8\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-rvws6\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:39 crc kubenswrapper[4707]: I0218 06:19:39.404174 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:19:40 crc kubenswrapper[4707]: I0218 06:19:40.006148 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6"] Feb 18 06:19:40 crc kubenswrapper[4707]: I0218 06:19:40.941063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" event={"ID":"89235767-8eea-43b0-9b2e-cf7fc766a260","Type":"ContainerStarted","Data":"2f82afc19f860f2ba8e899bf04e3512a55521b089ea3bf63eb75a8303e42632c"} Feb 18 06:19:40 crc kubenswrapper[4707]: I0218 06:19:40.941904 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" event={"ID":"89235767-8eea-43b0-9b2e-cf7fc766a260","Type":"ContainerStarted","Data":"201d8cfddfcd44d6aaa43fea09f48f967f746dece355d4f7ad23d00dfeba18ee"} Feb 18 06:19:40 crc kubenswrapper[4707]: I0218 06:19:40.967282 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" podStartSLOduration=1.5652552370000001 podStartE2EDuration="1.967256311s" podCreationTimestamp="2026-02-18 06:19:39 +0000 UTC" firstStartedPulling="2026-02-18 06:19:40.016866541 +0000 UTC m=+1916.664825675" lastFinishedPulling="2026-02-18 06:19:40.418867595 +0000 UTC m=+1917.066826749" observedRunningTime="2026-02-18 06:19:40.958638859 +0000 UTC m=+1917.606597993" watchObservedRunningTime="2026-02-18 06:19:40.967256311 +0000 UTC m=+1917.615215445" Feb 18 06:19:43 crc kubenswrapper[4707]: I0218 06:19:43.074469 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-grmw2"] Feb 18 06:19:43 crc kubenswrapper[4707]: I0218 06:19:43.088628 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-grmw2"] Feb 18 06:19:44 crc kubenswrapper[4707]: I0218 06:19:44.065043 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362e99d9-d597-4698-ba43-339a153a5ff9" path="/var/lib/kubelet/pods/362e99d9-d597-4698-ba43-339a153a5ff9/volumes" Feb 18 06:19:49 crc kubenswrapper[4707]: I0218 06:19:49.364605 4707 scope.go:117] "RemoveContainer" containerID="130dfe5eb55c2d592f127c6835ef2eb0ed35bc28813e06ad274b0a7f679da466" Feb 18 06:19:49 crc kubenswrapper[4707]: I0218 06:19:49.422076 4707 scope.go:117] "RemoveContainer" containerID="e4b127bfa70a9eae84101e91000f1fd1f05168390c49d6ed38a8bf7881f9e6c8" Feb 18 06:19:49 crc kubenswrapper[4707]: I0218 06:19:49.493619 4707 scope.go:117] "RemoveContainer" containerID="90dcd35fe507051c541b71b419ed8fd7f40936d24218c93090ff354907cea00f" Feb 18 06:19:51 crc kubenswrapper[4707]: I0218 06:19:51.381790 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:19:51 crc kubenswrapper[4707]: I0218 06:19:51.382197 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:20:19 crc kubenswrapper[4707]: I0218 06:20:19.289286 4707 generic.go:334] "Generic (PLEG): container finished" podID="89235767-8eea-43b0-9b2e-cf7fc766a260" 
containerID="2f82afc19f860f2ba8e899bf04e3512a55521b089ea3bf63eb75a8303e42632c" exitCode=0 Feb 18 06:20:19 crc kubenswrapper[4707]: I0218 06:20:19.289371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" event={"ID":"89235767-8eea-43b0-9b2e-cf7fc766a260","Type":"ContainerDied","Data":"2f82afc19f860f2ba8e899bf04e3512a55521b089ea3bf63eb75a8303e42632c"} Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.741300 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.838865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-repo-setup-combined-ca-bundle\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839161 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-ovn-default-certs-0\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ssh-key-openstack-edpm-ipam\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-libvirt-combined-ca-bundle\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-inventory\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-telemetry-combined-ca-bundle\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5hd8\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-kube-api-access-f5hd8\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ovn-combined-ca-bundle\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839526 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-bootstrap-combined-ca-bundle\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839548 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-neutron-metadata-combined-ca-bundle\") pod \"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.839591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-nova-combined-ca-bundle\") pod 
\"89235767-8eea-43b0-9b2e-cf7fc766a260\" (UID: \"89235767-8eea-43b0-9b2e-cf7fc766a260\") " Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.845111 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.845489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.846023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.846054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.846195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.846264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.847983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.848419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.848509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-kube-api-access-f5hd8" (OuterVolumeSpecName: "kube-api-access-f5hd8") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "kube-api-access-f5hd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.848849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.850345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.857701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.876323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.885123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-inventory" (OuterVolumeSpecName: "inventory") pod "89235767-8eea-43b0-9b2e-cf7fc766a260" (UID: "89235767-8eea-43b0-9b2e-cf7fc766a260"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941470 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941509 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941520 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941533 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941542 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941553 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941563 4707 reconciler_common.go:293] "Volume detached for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941572 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941583 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5hd8\" (UniqueName: \"kubernetes.io/projected/89235767-8eea-43b0-9b2e-cf7fc766a260-kube-api-access-f5hd8\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941592 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941600 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941608 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc kubenswrapper[4707]: I0218 06:20:20.941617 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:20 crc 
kubenswrapper[4707]: I0218 06:20:20.941626 4707 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89235767-8eea-43b0-9b2e-cf7fc766a260-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.309529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" event={"ID":"89235767-8eea-43b0-9b2e-cf7fc766a260","Type":"ContainerDied","Data":"201d8cfddfcd44d6aaa43fea09f48f967f746dece355d4f7ad23d00dfeba18ee"} Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.309584 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="201d8cfddfcd44d6aaa43fea09f48f967f746dece355d4f7ad23d00dfeba18ee" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.309613 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rvws6" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.382786 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.382861 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.482077 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh"] Feb 18 06:20:21 crc 
kubenswrapper[4707]: E0218 06:20:21.482583 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89235767-8eea-43b0-9b2e-cf7fc766a260" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.482606 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="89235767-8eea-43b0-9b2e-cf7fc766a260" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.482879 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="89235767-8eea-43b0-9b2e-cf7fc766a260" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.483610 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.486911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.487006 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.487324 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.487350 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.487619 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.507681 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh"] Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 
06:20:21.553947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.554030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbln\" (UniqueName: \"kubernetes.io/projected/09e0f47e-9057-4b18-ba9a-41b34b4fe425-kube-api-access-fkbln\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.554076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.554563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.554659 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.656341 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.656514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.657180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbln\" (UniqueName: \"kubernetes.io/projected/09e0f47e-9057-4b18-ba9a-41b34b4fe425-kube-api-access-fkbln\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.657258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.657314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.657778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.661256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.661962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.662607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.674613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbln\" (UniqueName: \"kubernetes.io/projected/09e0f47e-9057-4b18-ba9a-41b34b4fe425-kube-api-access-fkbln\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xvlwh\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:21 crc kubenswrapper[4707]: I0218 06:20:21.802653 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:20:22 crc kubenswrapper[4707]: I0218 06:20:22.315450 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh"] Feb 18 06:20:23 crc kubenswrapper[4707]: I0218 06:20:23.337609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" event={"ID":"09e0f47e-9057-4b18-ba9a-41b34b4fe425","Type":"ContainerStarted","Data":"b7c23762ae9b1449107852e402bfd874f33de203c9b28260b701d1d4247fdfbf"} Feb 18 06:20:23 crc kubenswrapper[4707]: I0218 06:20:23.338214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" event={"ID":"09e0f47e-9057-4b18-ba9a-41b34b4fe425","Type":"ContainerStarted","Data":"3f3cf1b14742411764c43472a532048fd104e1831ffe168f48aeed9ac0e24240"} Feb 18 06:20:23 crc kubenswrapper[4707]: I0218 06:20:23.367000 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" podStartSLOduration=1.937972406 podStartE2EDuration="2.366981459s" 
podCreationTimestamp="2026-02-18 06:20:21 +0000 UTC" firstStartedPulling="2026-02-18 06:20:22.323379092 +0000 UTC m=+1958.971338246" lastFinishedPulling="2026-02-18 06:20:22.752388165 +0000 UTC m=+1959.400347299" observedRunningTime="2026-02-18 06:20:23.358928642 +0000 UTC m=+1960.006887786" watchObservedRunningTime="2026-02-18 06:20:23.366981459 +0000 UTC m=+1960.014940603" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.730868 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-stj27"] Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.734852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.741341 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-stj27"] Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.825525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-catalog-content\") pod \"community-operators-stj27\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.825649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-utilities\") pod \"community-operators-stj27\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.825714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdv5x\" (UniqueName: 
\"kubernetes.io/projected/678dd54b-96f3-4e3e-8888-b7b09a99ddea-kube-api-access-vdv5x\") pod \"community-operators-stj27\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.927932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdv5x\" (UniqueName: \"kubernetes.io/projected/678dd54b-96f3-4e3e-8888-b7b09a99ddea-kube-api-access-vdv5x\") pod \"community-operators-stj27\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.928078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-catalog-content\") pod \"community-operators-stj27\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.928166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-utilities\") pod \"community-operators-stj27\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.928659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-catalog-content\") pod \"community-operators-stj27\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.928718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-utilities\") pod \"community-operators-stj27\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:39 crc kubenswrapper[4707]: I0218 06:20:39.946949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdv5x\" (UniqueName: \"kubernetes.io/projected/678dd54b-96f3-4e3e-8888-b7b09a99ddea-kube-api-access-vdv5x\") pod \"community-operators-stj27\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:40 crc kubenswrapper[4707]: I0218 06:20:40.072219 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:40 crc kubenswrapper[4707]: I0218 06:20:40.611420 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-stj27"] Feb 18 06:20:41 crc kubenswrapper[4707]: I0218 06:20:41.488863 4707 generic.go:334] "Generic (PLEG): container finished" podID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerID="34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4" exitCode=0 Feb 18 06:20:41 crc kubenswrapper[4707]: I0218 06:20:41.488908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stj27" event={"ID":"678dd54b-96f3-4e3e-8888-b7b09a99ddea","Type":"ContainerDied","Data":"34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4"} Feb 18 06:20:41 crc kubenswrapper[4707]: I0218 06:20:41.489244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stj27" event={"ID":"678dd54b-96f3-4e3e-8888-b7b09a99ddea","Type":"ContainerStarted","Data":"684b61a17eb41a699851b0921df8e04ddc90252b3f3a3680470a04d45c63de69"} Feb 18 06:20:42 crc kubenswrapper[4707]: I0218 06:20:42.499036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-stj27" event={"ID":"678dd54b-96f3-4e3e-8888-b7b09a99ddea","Type":"ContainerStarted","Data":"2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f"} Feb 18 06:20:43 crc kubenswrapper[4707]: I0218 06:20:43.509773 4707 generic.go:334] "Generic (PLEG): container finished" podID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerID="2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f" exitCode=0 Feb 18 06:20:43 crc kubenswrapper[4707]: I0218 06:20:43.509841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stj27" event={"ID":"678dd54b-96f3-4e3e-8888-b7b09a99ddea","Type":"ContainerDied","Data":"2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f"} Feb 18 06:20:44 crc kubenswrapper[4707]: I0218 06:20:44.525026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stj27" event={"ID":"678dd54b-96f3-4e3e-8888-b7b09a99ddea","Type":"ContainerStarted","Data":"ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a"} Feb 18 06:20:44 crc kubenswrapper[4707]: I0218 06:20:44.553500 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-stj27" podStartSLOduration=3.010668137 podStartE2EDuration="5.553481733s" podCreationTimestamp="2026-02-18 06:20:39 +0000 UTC" firstStartedPulling="2026-02-18 06:20:41.491422438 +0000 UTC m=+1978.139381572" lastFinishedPulling="2026-02-18 06:20:44.034236034 +0000 UTC m=+1980.682195168" observedRunningTime="2026-02-18 06:20:44.544974564 +0000 UTC m=+1981.192933708" watchObservedRunningTime="2026-02-18 06:20:44.553481733 +0000 UTC m=+1981.201440867" Feb 18 06:20:50 crc kubenswrapper[4707]: I0218 06:20:50.072990 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:50 crc kubenswrapper[4707]: I0218 06:20:50.074937 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:50 crc kubenswrapper[4707]: I0218 06:20:50.140604 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:50 crc kubenswrapper[4707]: I0218 06:20:50.638163 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:50 crc kubenswrapper[4707]: I0218 06:20:50.686052 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-stj27"] Feb 18 06:20:51 crc kubenswrapper[4707]: I0218 06:20:51.382248 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:20:51 crc kubenswrapper[4707]: I0218 06:20:51.382315 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:20:51 crc kubenswrapper[4707]: I0218 06:20:51.382364 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:20:51 crc kubenswrapper[4707]: I0218 06:20:51.383125 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b793e97ac0e534321a4fdc530604143a86bad11d81b78baa1a5c35dfbdc0cbf8"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:20:51 crc kubenswrapper[4707]: I0218 06:20:51.383169 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://b793e97ac0e534321a4fdc530604143a86bad11d81b78baa1a5c35dfbdc0cbf8" gracePeriod=600 Feb 18 06:20:51 crc kubenswrapper[4707]: I0218 06:20:51.597438 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="b793e97ac0e534321a4fdc530604143a86bad11d81b78baa1a5c35dfbdc0cbf8" exitCode=0 Feb 18 06:20:51 crc kubenswrapper[4707]: I0218 06:20:51.597554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"b793e97ac0e534321a4fdc530604143a86bad11d81b78baa1a5c35dfbdc0cbf8"} Feb 18 06:20:51 crc kubenswrapper[4707]: I0218 06:20:51.598221 4707 scope.go:117] "RemoveContainer" containerID="16d4676a6da8433e4f372e978b8abcb0debebf1bb4ebc90e9c277e5ed14ec9af" Feb 18 06:20:52 crc kubenswrapper[4707]: I0218 06:20:52.609208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820"} Feb 18 06:20:52 crc kubenswrapper[4707]: I0218 06:20:52.609325 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-stj27" podUID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerName="registry-server" containerID="cri-o://ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a" gracePeriod=2 Feb 18 06:20:53 crc 
kubenswrapper[4707]: I0218 06:20:53.184131 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.323716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-utilities\") pod \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.324158 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-catalog-content\") pod \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.324257 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdv5x\" (UniqueName: \"kubernetes.io/projected/678dd54b-96f3-4e3e-8888-b7b09a99ddea-kube-api-access-vdv5x\") pod \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\" (UID: \"678dd54b-96f3-4e3e-8888-b7b09a99ddea\") " Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.324498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-utilities" (OuterVolumeSpecName: "utilities") pod "678dd54b-96f3-4e3e-8888-b7b09a99ddea" (UID: "678dd54b-96f3-4e3e-8888-b7b09a99ddea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.324902 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.330709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678dd54b-96f3-4e3e-8888-b7b09a99ddea-kube-api-access-vdv5x" (OuterVolumeSpecName: "kube-api-access-vdv5x") pod "678dd54b-96f3-4e3e-8888-b7b09a99ddea" (UID: "678dd54b-96f3-4e3e-8888-b7b09a99ddea"). InnerVolumeSpecName "kube-api-access-vdv5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.376470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "678dd54b-96f3-4e3e-8888-b7b09a99ddea" (UID: "678dd54b-96f3-4e3e-8888-b7b09a99ddea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.427355 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678dd54b-96f3-4e3e-8888-b7b09a99ddea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.427395 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdv5x\" (UniqueName: \"kubernetes.io/projected/678dd54b-96f3-4e3e-8888-b7b09a99ddea-kube-api-access-vdv5x\") on node \"crc\" DevicePath \"\"" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.619675 4707 generic.go:334] "Generic (PLEG): container finished" podID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerID="ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a" exitCode=0 Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.619769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stj27" event={"ID":"678dd54b-96f3-4e3e-8888-b7b09a99ddea","Type":"ContainerDied","Data":"ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a"} Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.619842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stj27" event={"ID":"678dd54b-96f3-4e3e-8888-b7b09a99ddea","Type":"ContainerDied","Data":"684b61a17eb41a699851b0921df8e04ddc90252b3f3a3680470a04d45c63de69"} Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.619866 4707 scope.go:117] "RemoveContainer" containerID="ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.620884 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-stj27" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.649990 4707 scope.go:117] "RemoveContainer" containerID="2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.662465 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-stj27"] Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.670302 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-stj27"] Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.672247 4707 scope.go:117] "RemoveContainer" containerID="34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.726325 4707 scope.go:117] "RemoveContainer" containerID="ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a" Feb 18 06:20:53 crc kubenswrapper[4707]: E0218 06:20:53.726839 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a\": container with ID starting with ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a not found: ID does not exist" containerID="ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.726876 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a"} err="failed to get container status \"ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a\": rpc error: code = NotFound desc = could not find container \"ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a\": container with ID starting with ed1fec44d90636d2b6051929b34b26a280e2aec7b21f20bca2987b12c4d1d48a not 
found: ID does not exist" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.726903 4707 scope.go:117] "RemoveContainer" containerID="2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f" Feb 18 06:20:53 crc kubenswrapper[4707]: E0218 06:20:53.728012 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f\": container with ID starting with 2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f not found: ID does not exist" containerID="2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.728064 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f"} err="failed to get container status \"2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f\": rpc error: code = NotFound desc = could not find container \"2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f\": container with ID starting with 2e0f06a4dd43f731251b863045c217736450d473d27b7207b263de6d6b3c477f not found: ID does not exist" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.728098 4707 scope.go:117] "RemoveContainer" containerID="34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4" Feb 18 06:20:53 crc kubenswrapper[4707]: E0218 06:20:53.728535 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4\": container with ID starting with 34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4 not found: ID does not exist" containerID="34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4" Feb 18 06:20:53 crc kubenswrapper[4707]: I0218 06:20:53.728557 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4"} err="failed to get container status \"34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4\": rpc error: code = NotFound desc = could not find container \"34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4\": container with ID starting with 34afd8c1cb96d0a00aec872a0ebf817b675ea2a982c6ac2d0ab67ab6b96df6b4 not found: ID does not exist" Feb 18 06:20:54 crc kubenswrapper[4707]: I0218 06:20:54.074028 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" path="/var/lib/kubelet/pods/678dd54b-96f3-4e3e-8888-b7b09a99ddea/volumes" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.795127 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wjt42"] Feb 18 06:20:55 crc kubenswrapper[4707]: E0218 06:20:55.796015 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerName="registry-server" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.796026 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerName="registry-server" Feb 18 06:20:55 crc kubenswrapper[4707]: E0218 06:20:55.796043 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerName="extract-utilities" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.796050 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerName="extract-utilities" Feb 18 06:20:55 crc kubenswrapper[4707]: E0218 06:20:55.796078 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerName="extract-content" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 
06:20:55.796085 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerName="extract-content" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.796257 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="678dd54b-96f3-4e3e-8888-b7b09a99ddea" containerName="registry-server" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.797645 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.827510 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjt42"] Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.873206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-utilities\") pod \"redhat-marketplace-wjt42\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.873259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-catalog-content\") pod \"redhat-marketplace-wjt42\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.873313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gpq4\" (UniqueName: \"kubernetes.io/projected/dec791c5-5ec0-49ae-8528-49002125ec7d-kube-api-access-2gpq4\") pod \"redhat-marketplace-wjt42\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:55 crc kubenswrapper[4707]: 
I0218 06:20:55.974759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gpq4\" (UniqueName: \"kubernetes.io/projected/dec791c5-5ec0-49ae-8528-49002125ec7d-kube-api-access-2gpq4\") pod \"redhat-marketplace-wjt42\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.974977 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-utilities\") pod \"redhat-marketplace-wjt42\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.975003 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-catalog-content\") pod \"redhat-marketplace-wjt42\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.975449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-catalog-content\") pod \"redhat-marketplace-wjt42\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.975585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-utilities\") pod \"redhat-marketplace-wjt42\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:55 crc kubenswrapper[4707]: I0218 06:20:55.995617 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gpq4\" (UniqueName: \"kubernetes.io/projected/dec791c5-5ec0-49ae-8528-49002125ec7d-kube-api-access-2gpq4\") pod \"redhat-marketplace-wjt42\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:56 crc kubenswrapper[4707]: I0218 06:20:56.159247 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:20:58 crc kubenswrapper[4707]: I0218 06:20:58.686401 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-79d975b745-8gngt" podUID="8ed2f5cf-84b8-4a09-b76f-a60bcb055a04" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": dial tcp 10.217.0.77:8081: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 18 06:20:59 crc kubenswrapper[4707]: I0218 06:20:59.608886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjt42"] Feb 18 06:21:00 crc kubenswrapper[4707]: I0218 06:21:00.507702 4707 generic.go:334] "Generic (PLEG): container finished" podID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerID="f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1" exitCode=0 Feb 18 06:21:00 crc kubenswrapper[4707]: I0218 06:21:00.508103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjt42" event={"ID":"dec791c5-5ec0-49ae-8528-49002125ec7d","Type":"ContainerDied","Data":"f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1"} Feb 18 06:21:00 crc kubenswrapper[4707]: I0218 06:21:00.508444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjt42" 
event={"ID":"dec791c5-5ec0-49ae-8528-49002125ec7d","Type":"ContainerStarted","Data":"ed2bfe67d54968dbc5c659a54bf0d8faaa0f8a95b9b1b318d9f37da18d71cf90"} Feb 18 06:21:01 crc kubenswrapper[4707]: I0218 06:21:01.520602 4707 generic.go:334] "Generic (PLEG): container finished" podID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerID="b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da" exitCode=0 Feb 18 06:21:01 crc kubenswrapper[4707]: I0218 06:21:01.521438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjt42" event={"ID":"dec791c5-5ec0-49ae-8528-49002125ec7d","Type":"ContainerDied","Data":"b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da"} Feb 18 06:21:02 crc kubenswrapper[4707]: I0218 06:21:02.534721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjt42" event={"ID":"dec791c5-5ec0-49ae-8528-49002125ec7d","Type":"ContainerStarted","Data":"c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf"} Feb 18 06:21:02 crc kubenswrapper[4707]: I0218 06:21:02.564878 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wjt42" podStartSLOduration=6.170290915 podStartE2EDuration="7.564850298s" podCreationTimestamp="2026-02-18 06:20:55 +0000 UTC" firstStartedPulling="2026-02-18 06:21:00.510671096 +0000 UTC m=+1997.158630240" lastFinishedPulling="2026-02-18 06:21:01.905230489 +0000 UTC m=+1998.553189623" observedRunningTime="2026-02-18 06:21:02.55160404 +0000 UTC m=+1999.199563184" watchObservedRunningTime="2026-02-18 06:21:02.564850298 +0000 UTC m=+1999.212809432" Feb 18 06:21:06 crc kubenswrapper[4707]: I0218 06:21:06.159510 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:21:06 crc kubenswrapper[4707]: I0218 06:21:06.160070 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:21:06 crc kubenswrapper[4707]: I0218 06:21:06.212207 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:21:16 crc kubenswrapper[4707]: I0218 06:21:16.209750 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:21:16 crc kubenswrapper[4707]: I0218 06:21:16.270300 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjt42"] Feb 18 06:21:16 crc kubenswrapper[4707]: I0218 06:21:16.659395 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wjt42" podUID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerName="registry-server" containerID="cri-o://c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf" gracePeriod=2 Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.109627 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.252856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gpq4\" (UniqueName: \"kubernetes.io/projected/dec791c5-5ec0-49ae-8528-49002125ec7d-kube-api-access-2gpq4\") pod \"dec791c5-5ec0-49ae-8528-49002125ec7d\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.253044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-utilities\") pod \"dec791c5-5ec0-49ae-8528-49002125ec7d\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.253140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-catalog-content\") pod \"dec791c5-5ec0-49ae-8528-49002125ec7d\" (UID: \"dec791c5-5ec0-49ae-8528-49002125ec7d\") " Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.253790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-utilities" (OuterVolumeSpecName: "utilities") pod "dec791c5-5ec0-49ae-8528-49002125ec7d" (UID: "dec791c5-5ec0-49ae-8528-49002125ec7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.258053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec791c5-5ec0-49ae-8528-49002125ec7d-kube-api-access-2gpq4" (OuterVolumeSpecName: "kube-api-access-2gpq4") pod "dec791c5-5ec0-49ae-8528-49002125ec7d" (UID: "dec791c5-5ec0-49ae-8528-49002125ec7d"). InnerVolumeSpecName "kube-api-access-2gpq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.298832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dec791c5-5ec0-49ae-8528-49002125ec7d" (UID: "dec791c5-5ec0-49ae-8528-49002125ec7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.355089 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gpq4\" (UniqueName: \"kubernetes.io/projected/dec791c5-5ec0-49ae-8528-49002125ec7d-kube-api-access-2gpq4\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.355123 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.355133 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec791c5-5ec0-49ae-8528-49002125ec7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.670801 4707 generic.go:334] "Generic (PLEG): container finished" podID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerID="c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf" exitCode=0 Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.670841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjt42" event={"ID":"dec791c5-5ec0-49ae-8528-49002125ec7d","Type":"ContainerDied","Data":"c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf"} Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.670867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-wjt42" event={"ID":"dec791c5-5ec0-49ae-8528-49002125ec7d","Type":"ContainerDied","Data":"ed2bfe67d54968dbc5c659a54bf0d8faaa0f8a95b9b1b318d9f37da18d71cf90"} Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.670883 4707 scope.go:117] "RemoveContainer" containerID="c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.670929 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjt42" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.698435 4707 scope.go:117] "RemoveContainer" containerID="b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.729116 4707 scope.go:117] "RemoveContainer" containerID="f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.737385 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjt42"] Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.748351 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjt42"] Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.779695 4707 scope.go:117] "RemoveContainer" containerID="c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf" Feb 18 06:21:17 crc kubenswrapper[4707]: E0218 06:21:17.780302 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf\": container with ID starting with c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf not found: ID does not exist" containerID="c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.780343 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf"} err="failed to get container status \"c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf\": rpc error: code = NotFound desc = could not find container \"c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf\": container with ID starting with c952f45e2bda86cb5a8c0a0038eb059ba87e6986d140f7b866d29b3b0b7ac9bf not found: ID does not exist" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.780370 4707 scope.go:117] "RemoveContainer" containerID="b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da" Feb 18 06:21:17 crc kubenswrapper[4707]: E0218 06:21:17.781269 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da\": container with ID starting with b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da not found: ID does not exist" containerID="b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.781293 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da"} err="failed to get container status \"b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da\": rpc error: code = NotFound desc = could not find container \"b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da\": container with ID starting with b59f79203a46b87e61cf62b2e46ac253a18cb4ac47d3d80283810729696de4da not found: ID does not exist" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.781310 4707 scope.go:117] "RemoveContainer" containerID="f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1" Feb 18 06:21:17 crc kubenswrapper[4707]: E0218 
06:21:17.789351 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1\": container with ID starting with f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1 not found: ID does not exist" containerID="f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1" Feb 18 06:21:17 crc kubenswrapper[4707]: I0218 06:21:17.789407 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1"} err="failed to get container status \"f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1\": rpc error: code = NotFound desc = could not find container \"f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1\": container with ID starting with f3c39af9ac032b73b22b749b2b7fa476f3dc24758a3f4d98e68e8a12ae85c9c1 not found: ID does not exist" Feb 18 06:21:18 crc kubenswrapper[4707]: I0218 06:21:18.070957 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec791c5-5ec0-49ae-8528-49002125ec7d" path="/var/lib/kubelet/pods/dec791c5-5ec0-49ae-8528-49002125ec7d/volumes" Feb 18 06:21:30 crc kubenswrapper[4707]: I0218 06:21:30.793407 4707 generic.go:334] "Generic (PLEG): container finished" podID="09e0f47e-9057-4b18-ba9a-41b34b4fe425" containerID="b7c23762ae9b1449107852e402bfd874f33de203c9b28260b701d1d4247fdfbf" exitCode=0 Feb 18 06:21:30 crc kubenswrapper[4707]: I0218 06:21:30.793525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" event={"ID":"09e0f47e-9057-4b18-ba9a-41b34b4fe425","Type":"ContainerDied","Data":"b7c23762ae9b1449107852e402bfd874f33de203c9b28260b701d1d4247fdfbf"} Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.267504 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.361553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ssh-key-openstack-edpm-ipam\") pod \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.361682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovncontroller-config-0\") pod \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.361741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkbln\" (UniqueName: \"kubernetes.io/projected/09e0f47e-9057-4b18-ba9a-41b34b4fe425-kube-api-access-fkbln\") pod \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.361836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-inventory\") pod \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.361889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovn-combined-ca-bundle\") pod \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\" (UID: \"09e0f47e-9057-4b18-ba9a-41b34b4fe425\") " Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.370918 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "09e0f47e-9057-4b18-ba9a-41b34b4fe425" (UID: "09e0f47e-9057-4b18-ba9a-41b34b4fe425"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.374213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09e0f47e-9057-4b18-ba9a-41b34b4fe425-kube-api-access-fkbln" (OuterVolumeSpecName: "kube-api-access-fkbln") pod "09e0f47e-9057-4b18-ba9a-41b34b4fe425" (UID: "09e0f47e-9057-4b18-ba9a-41b34b4fe425"). InnerVolumeSpecName "kube-api-access-fkbln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.405199 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "09e0f47e-9057-4b18-ba9a-41b34b4fe425" (UID: "09e0f47e-9057-4b18-ba9a-41b34b4fe425"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.415352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "09e0f47e-9057-4b18-ba9a-41b34b4fe425" (UID: "09e0f47e-9057-4b18-ba9a-41b34b4fe425"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.436044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-inventory" (OuterVolumeSpecName: "inventory") pod "09e0f47e-9057-4b18-ba9a-41b34b4fe425" (UID: "09e0f47e-9057-4b18-ba9a-41b34b4fe425"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.464998 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.465057 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.465074 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.465085 4707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/09e0f47e-9057-4b18-ba9a-41b34b4fe425-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.465095 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkbln\" (UniqueName: \"kubernetes.io/projected/09e0f47e-9057-4b18-ba9a-41b34b4fe425-kube-api-access-fkbln\") on node \"crc\" DevicePath \"\"" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.812968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" event={"ID":"09e0f47e-9057-4b18-ba9a-41b34b4fe425","Type":"ContainerDied","Data":"3f3cf1b14742411764c43472a532048fd104e1831ffe168f48aeed9ac0e24240"} Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.813015 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f3cf1b14742411764c43472a532048fd104e1831ffe168f48aeed9ac0e24240" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.813014 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xvlwh" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.908775 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh"] Feb 18 06:21:32 crc kubenswrapper[4707]: E0218 06:21:32.909178 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e0f47e-9057-4b18-ba9a-41b34b4fe425" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.909194 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e0f47e-9057-4b18-ba9a-41b34b4fe425" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 18 06:21:32 crc kubenswrapper[4707]: E0218 06:21:32.909228 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerName="registry-server" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.909235 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerName="registry-server" Feb 18 06:21:32 crc kubenswrapper[4707]: E0218 06:21:32.909247 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerName="extract-content" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.909254 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerName="extract-content" Feb 18 06:21:32 crc kubenswrapper[4707]: E0218 06:21:32.909266 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerName="extract-utilities" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.909272 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerName="extract-utilities" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.909469 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec791c5-5ec0-49ae-8528-49002125ec7d" containerName="registry-server" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.909500 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="09e0f47e-9057-4b18-ba9a-41b34b4fe425" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.910318 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.914339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.914399 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.914340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.914580 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.914678 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.914771 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:21:32 crc kubenswrapper[4707]: I0218 06:21:32.920995 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh"] Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.078134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.078185 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.078262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.078330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.078412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9sw8\" (UniqueName: \"kubernetes.io/projected/cf9b64ea-e740-4b80-b899-5f856afdd9c7-kube-api-access-r9sw8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.078445 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.180283 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9sw8\" (UniqueName: \"kubernetes.io/projected/cf9b64ea-e740-4b80-b899-5f856afdd9c7-kube-api-access-r9sw8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.180376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.180451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.180533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.180667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.180849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.185333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.192626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.192756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.193339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.205186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.214508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9sw8\" (UniqueName: \"kubernetes.io/projected/cf9b64ea-e740-4b80-b899-5f856afdd9c7-kube-api-access-r9sw8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh\" (UID: 
\"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.279190 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:21:33 crc kubenswrapper[4707]: I0218 06:21:33.829290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh"] Feb 18 06:21:34 crc kubenswrapper[4707]: I0218 06:21:34.833509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" event={"ID":"cf9b64ea-e740-4b80-b899-5f856afdd9c7","Type":"ContainerStarted","Data":"4aead9d6427735a8dc0c68c24d231b59b78013606ad600237cc36d955df116ef"} Feb 18 06:21:34 crc kubenswrapper[4707]: I0218 06:21:34.834138 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" event={"ID":"cf9b64ea-e740-4b80-b899-5f856afdd9c7","Type":"ContainerStarted","Data":"acb6cf3988dfc24d902e7bbad3a637c801ce1622ad9e6a5dff7046053200e82a"} Feb 18 06:21:34 crc kubenswrapper[4707]: I0218 06:21:34.854535 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" podStartSLOduration=2.461240567 podStartE2EDuration="2.854508779s" podCreationTimestamp="2026-02-18 06:21:32 +0000 UTC" firstStartedPulling="2026-02-18 06:21:33.830854348 +0000 UTC m=+2030.478813482" lastFinishedPulling="2026-02-18 06:21:34.22412257 +0000 UTC m=+2030.872081694" observedRunningTime="2026-02-18 06:21:34.850585673 +0000 UTC m=+2031.498544807" watchObservedRunningTime="2026-02-18 06:21:34.854508779 +0000 UTC m=+2031.502468013" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.444451 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-cczzr"] Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.449347 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.451253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9krz\" (UniqueName: \"kubernetes.io/projected/c99c9870-17dd-4735-b01d-92479be513c8-kube-api-access-g9krz\") pod \"redhat-operators-cczzr\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.451334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-utilities\") pod \"redhat-operators-cczzr\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.451353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-catalog-content\") pod \"redhat-operators-cczzr\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.478693 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cczzr"] Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.553242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9krz\" (UniqueName: \"kubernetes.io/projected/c99c9870-17dd-4735-b01d-92479be513c8-kube-api-access-g9krz\") pod \"redhat-operators-cczzr\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " 
pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.553464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-utilities\") pod \"redhat-operators-cczzr\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.553505 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-catalog-content\") pod \"redhat-operators-cczzr\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.554083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-catalog-content\") pod \"redhat-operators-cczzr\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.554079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-utilities\") pod \"redhat-operators-cczzr\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc kubenswrapper[4707]: I0218 06:21:46.577687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9krz\" (UniqueName: \"kubernetes.io/projected/c99c9870-17dd-4735-b01d-92479be513c8-kube-api-access-g9krz\") pod \"redhat-operators-cczzr\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:46 crc 
kubenswrapper[4707]: I0218 06:21:46.774160 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:47 crc kubenswrapper[4707]: I0218 06:21:47.225110 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cczzr"] Feb 18 06:21:47 crc kubenswrapper[4707]: I0218 06:21:47.945783 4707 generic.go:334] "Generic (PLEG): container finished" podID="c99c9870-17dd-4735-b01d-92479be513c8" containerID="c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6" exitCode=0 Feb 18 06:21:47 crc kubenswrapper[4707]: I0218 06:21:47.945892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczzr" event={"ID":"c99c9870-17dd-4735-b01d-92479be513c8","Type":"ContainerDied","Data":"c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6"} Feb 18 06:21:47 crc kubenswrapper[4707]: I0218 06:21:47.946103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczzr" event={"ID":"c99c9870-17dd-4735-b01d-92479be513c8","Type":"ContainerStarted","Data":"c36bff6df0e6225bcd39cebd8d8e7d8ae5aebac3ebd68fba9005ef3fa045f884"} Feb 18 06:21:48 crc kubenswrapper[4707]: I0218 06:21:48.956028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczzr" event={"ID":"c99c9870-17dd-4735-b01d-92479be513c8","Type":"ContainerStarted","Data":"2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3"} Feb 18 06:21:50 crc kubenswrapper[4707]: I0218 06:21:50.978322 4707 generic.go:334] "Generic (PLEG): container finished" podID="c99c9870-17dd-4735-b01d-92479be513c8" containerID="2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3" exitCode=0 Feb 18 06:21:50 crc kubenswrapper[4707]: I0218 06:21:50.978389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczzr" 
event={"ID":"c99c9870-17dd-4735-b01d-92479be513c8","Type":"ContainerDied","Data":"2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3"} Feb 18 06:21:53 crc kubenswrapper[4707]: I0218 06:21:53.000192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczzr" event={"ID":"c99c9870-17dd-4735-b01d-92479be513c8","Type":"ContainerStarted","Data":"5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426"} Feb 18 06:21:53 crc kubenswrapper[4707]: I0218 06:21:53.024085 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cczzr" podStartSLOduration=2.919644807 podStartE2EDuration="7.024061083s" podCreationTimestamp="2026-02-18 06:21:46 +0000 UTC" firstStartedPulling="2026-02-18 06:21:47.947898048 +0000 UTC m=+2044.595857182" lastFinishedPulling="2026-02-18 06:21:52.052314304 +0000 UTC m=+2048.700273458" observedRunningTime="2026-02-18 06:21:53.018323217 +0000 UTC m=+2049.666282371" watchObservedRunningTime="2026-02-18 06:21:53.024061083 +0000 UTC m=+2049.672020217" Feb 18 06:21:56 crc kubenswrapper[4707]: I0218 06:21:56.774893 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:56 crc kubenswrapper[4707]: I0218 06:21:56.775350 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:21:57 crc kubenswrapper[4707]: I0218 06:21:57.822604 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cczzr" podUID="c99c9870-17dd-4735-b01d-92479be513c8" containerName="registry-server" probeResult="failure" output=< Feb 18 06:21:57 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Feb 18 06:21:57 crc kubenswrapper[4707]: > Feb 18 06:22:06 crc kubenswrapper[4707]: I0218 06:22:06.836706 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:22:06 crc kubenswrapper[4707]: I0218 06:22:06.904691 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:22:07 crc kubenswrapper[4707]: I0218 06:22:07.073344 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cczzr"] Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.161510 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cczzr" podUID="c99c9870-17dd-4735-b01d-92479be513c8" containerName="registry-server" containerID="cri-o://5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426" gracePeriod=2 Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.678858 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.827114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-utilities\") pod \"c99c9870-17dd-4735-b01d-92479be513c8\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.827251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9krz\" (UniqueName: \"kubernetes.io/projected/c99c9870-17dd-4735-b01d-92479be513c8-kube-api-access-g9krz\") pod \"c99c9870-17dd-4735-b01d-92479be513c8\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.827333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-catalog-content\") pod 
\"c99c9870-17dd-4735-b01d-92479be513c8\" (UID: \"c99c9870-17dd-4735-b01d-92479be513c8\") " Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.828184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-utilities" (OuterVolumeSpecName: "utilities") pod "c99c9870-17dd-4735-b01d-92479be513c8" (UID: "c99c9870-17dd-4735-b01d-92479be513c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.828971 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.833747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99c9870-17dd-4735-b01d-92479be513c8-kube-api-access-g9krz" (OuterVolumeSpecName: "kube-api-access-g9krz") pod "c99c9870-17dd-4735-b01d-92479be513c8" (UID: "c99c9870-17dd-4735-b01d-92479be513c8"). InnerVolumeSpecName "kube-api-access-g9krz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.931083 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9krz\" (UniqueName: \"kubernetes.io/projected/c99c9870-17dd-4735-b01d-92479be513c8-kube-api-access-g9krz\") on node \"crc\" DevicePath \"\"" Feb 18 06:22:08 crc kubenswrapper[4707]: I0218 06:22:08.979882 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c99c9870-17dd-4735-b01d-92479be513c8" (UID: "c99c9870-17dd-4735-b01d-92479be513c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.033656 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99c9870-17dd-4735-b01d-92479be513c8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.172566 4707 generic.go:334] "Generic (PLEG): container finished" podID="c99c9870-17dd-4735-b01d-92479be513c8" containerID="5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426" exitCode=0 Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.172618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczzr" event={"ID":"c99c9870-17dd-4735-b01d-92479be513c8","Type":"ContainerDied","Data":"5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426"} Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.172650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczzr" event={"ID":"c99c9870-17dd-4735-b01d-92479be513c8","Type":"ContainerDied","Data":"c36bff6df0e6225bcd39cebd8d8e7d8ae5aebac3ebd68fba9005ef3fa045f884"} Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.172670 4707 scope.go:117] "RemoveContainer" containerID="5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.172854 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cczzr" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.204840 4707 scope.go:117] "RemoveContainer" containerID="2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.241484 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cczzr"] Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.250869 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cczzr"] Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.253253 4707 scope.go:117] "RemoveContainer" containerID="c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.278183 4707 scope.go:117] "RemoveContainer" containerID="5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426" Feb 18 06:22:09 crc kubenswrapper[4707]: E0218 06:22:09.278687 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426\": container with ID starting with 5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426 not found: ID does not exist" containerID="5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.278742 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426"} err="failed to get container status \"5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426\": rpc error: code = NotFound desc = could not find container \"5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426\": container with ID starting with 5b404e5c5a7190b3ff30c0826637d7517450cd2aea95fbde3bcb8bbe894cf426 not found: ID does 
not exist" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.278770 4707 scope.go:117] "RemoveContainer" containerID="2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3" Feb 18 06:22:09 crc kubenswrapper[4707]: E0218 06:22:09.279057 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3\": container with ID starting with 2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3 not found: ID does not exist" containerID="2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.279179 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3"} err="failed to get container status \"2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3\": rpc error: code = NotFound desc = could not find container \"2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3\": container with ID starting with 2524d8351aabdb5fb9d4dad707ca08960ff638b102fb60d6f67464162b27e9b3 not found: ID does not exist" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.279276 4707 scope.go:117] "RemoveContainer" containerID="c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6" Feb 18 06:22:09 crc kubenswrapper[4707]: E0218 06:22:09.279554 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6\": container with ID starting with c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6 not found: ID does not exist" containerID="c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6" Feb 18 06:22:09 crc kubenswrapper[4707]: I0218 06:22:09.279578 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6"} err="failed to get container status \"c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6\": rpc error: code = NotFound desc = could not find container \"c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6\": container with ID starting with c1cad1c6f896f9a472bcac5065e6e35d2f8f1cf300d3961d6c534dc49d08d8f6 not found: ID does not exist" Feb 18 06:22:10 crc kubenswrapper[4707]: I0218 06:22:10.065748 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99c9870-17dd-4735-b01d-92479be513c8" path="/var/lib/kubelet/pods/c99c9870-17dd-4735-b01d-92479be513c8/volumes" Feb 18 06:22:22 crc kubenswrapper[4707]: I0218 06:22:22.296585 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf9b64ea-e740-4b80-b899-5f856afdd9c7" containerID="4aead9d6427735a8dc0c68c24d231b59b78013606ad600237cc36d955df116ef" exitCode=0 Feb 18 06:22:22 crc kubenswrapper[4707]: I0218 06:22:22.296629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" event={"ID":"cf9b64ea-e740-4b80-b899-5f856afdd9c7","Type":"ContainerDied","Data":"4aead9d6427735a8dc0c68c24d231b59b78013606ad600237cc36d955df116ef"} Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.744014 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.820129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.820196 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-nova-metadata-neutron-config-0\") pod \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.820302 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-inventory\") pod \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.820501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-metadata-combined-ca-bundle\") pod \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.820617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-ssh-key-openstack-edpm-ipam\") pod \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\" (UID: 
\"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.820642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9sw8\" (UniqueName: \"kubernetes.io/projected/cf9b64ea-e740-4b80-b899-5f856afdd9c7-kube-api-access-r9sw8\") pod \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\" (UID: \"cf9b64ea-e740-4b80-b899-5f856afdd9c7\") " Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.826036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cf9b64ea-e740-4b80-b899-5f856afdd9c7" (UID: "cf9b64ea-e740-4b80-b899-5f856afdd9c7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.827048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9b64ea-e740-4b80-b899-5f856afdd9c7-kube-api-access-r9sw8" (OuterVolumeSpecName: "kube-api-access-r9sw8") pod "cf9b64ea-e740-4b80-b899-5f856afdd9c7" (UID: "cf9b64ea-e740-4b80-b899-5f856afdd9c7"). InnerVolumeSpecName "kube-api-access-r9sw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.847949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-inventory" (OuterVolumeSpecName: "inventory") pod "cf9b64ea-e740-4b80-b899-5f856afdd9c7" (UID: "cf9b64ea-e740-4b80-b899-5f856afdd9c7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.850279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "cf9b64ea-e740-4b80-b899-5f856afdd9c7" (UID: "cf9b64ea-e740-4b80-b899-5f856afdd9c7"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.851081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "cf9b64ea-e740-4b80-b899-5f856afdd9c7" (UID: "cf9b64ea-e740-4b80-b899-5f856afdd9c7"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.858414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cf9b64ea-e740-4b80-b899-5f856afdd9c7" (UID: "cf9b64ea-e740-4b80-b899-5f856afdd9c7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.921933 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.921968 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9sw8\" (UniqueName: \"kubernetes.io/projected/cf9b64ea-e740-4b80-b899-5f856afdd9c7-kube-api-access-r9sw8\") on node \"crc\" DevicePath \"\"" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.921979 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.921989 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.922010 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:22:23 crc kubenswrapper[4707]: I0218 06:22:23.922020 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9b64ea-e740-4b80-b899-5f856afdd9c7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.314488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" event={"ID":"cf9b64ea-e740-4b80-b899-5f856afdd9c7","Type":"ContainerDied","Data":"acb6cf3988dfc24d902e7bbad3a637c801ce1622ad9e6a5dff7046053200e82a"} Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.314532 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb6cf3988dfc24d902e7bbad3a637c801ce1622ad9e6a5dff7046053200e82a" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.314884 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.412584 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv"] Feb 18 06:22:24 crc kubenswrapper[4707]: E0218 06:22:24.413043 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99c9870-17dd-4735-b01d-92479be513c8" containerName="extract-utilities" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.413063 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99c9870-17dd-4735-b01d-92479be513c8" containerName="extract-utilities" Feb 18 06:22:24 crc kubenswrapper[4707]: E0218 06:22:24.413084 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99c9870-17dd-4735-b01d-92479be513c8" containerName="extract-content" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.413090 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99c9870-17dd-4735-b01d-92479be513c8" containerName="extract-content" Feb 18 06:22:24 crc kubenswrapper[4707]: E0218 06:22:24.413104 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9b64ea-e740-4b80-b899-5f856afdd9c7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.413111 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cf9b64ea-e740-4b80-b899-5f856afdd9c7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 06:22:24 crc kubenswrapper[4707]: E0218 06:22:24.413124 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99c9870-17dd-4735-b01d-92479be513c8" containerName="registry-server" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.413130 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99c9870-17dd-4735-b01d-92479be513c8" containerName="registry-server" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.413303 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9b64ea-e740-4b80-b899-5f856afdd9c7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.413320 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99c9870-17dd-4735-b01d-92479be513c8" containerName="registry-server" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.413957 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.416416 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.416416 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.416858 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.417982 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.418945 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.446574 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv"] Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.537842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmqsb\" (UniqueName: \"kubernetes.io/projected/603500de-24c1-4ef6-a13a-24646a085b58-kube-api-access-xmqsb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.537925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: 
\"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.537980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.538000 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.538081 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.639645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.639695 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.639829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.639899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmqsb\" (UniqueName: \"kubernetes.io/projected/603500de-24c1-4ef6-a13a-24646a085b58-kube-api-access-xmqsb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.639940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.645861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.647040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.647142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.649402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.663561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmqsb\" (UniqueName: \"kubernetes.io/projected/603500de-24c1-4ef6-a13a-24646a085b58-kube-api-access-xmqsb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:24 crc kubenswrapper[4707]: I0218 06:22:24.746710 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:22:25 crc kubenswrapper[4707]: I0218 06:22:25.270244 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:22:25 crc kubenswrapper[4707]: I0218 06:22:25.270862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv"] Feb 18 06:22:25 crc kubenswrapper[4707]: I0218 06:22:25.324363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" event={"ID":"603500de-24c1-4ef6-a13a-24646a085b58","Type":"ContainerStarted","Data":"e835037fcc5fe3a7112858b46d3bc0a72441e12e530e73adcbbe39d3f89c146d"} Feb 18 06:22:26 crc kubenswrapper[4707]: I0218 06:22:26.333782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" event={"ID":"603500de-24c1-4ef6-a13a-24646a085b58","Type":"ContainerStarted","Data":"5a322aaae3f5053ebfe37df0b7dd92e8a6fceecce06cbf70ba2922d36e68bc76"} Feb 18 06:22:26 crc kubenswrapper[4707]: I0218 06:22:26.358766 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" podStartSLOduration=1.90520207 podStartE2EDuration="2.358750911s" podCreationTimestamp="2026-02-18 06:22:24 +0000 UTC" firstStartedPulling="2026-02-18 06:22:25.269975982 +0000 UTC m=+2081.917935116" lastFinishedPulling="2026-02-18 06:22:25.723524823 +0000 UTC m=+2082.371483957" observedRunningTime="2026-02-18 06:22:26.353473868 +0000 UTC m=+2083.001433002" watchObservedRunningTime="2026-02-18 06:22:26.358750911 +0000 UTC m=+2083.006710045" Feb 18 06:22:51 crc kubenswrapper[4707]: I0218 06:22:51.382943 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:22:51 crc kubenswrapper[4707]: I0218 06:22:51.383734 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.701111 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-smmvx"] Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.704365 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.716153 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-smmvx"] Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.739076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-catalog-content\") pod \"certified-operators-smmvx\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.739159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-utilities\") pod \"certified-operators-smmvx\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.739303 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsrpm\" (UniqueName: \"kubernetes.io/projected/48c21772-a9e0-4650-b350-4ad517c600a1-kube-api-access-zsrpm\") pod \"certified-operators-smmvx\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.841420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsrpm\" (UniqueName: \"kubernetes.io/projected/48c21772-a9e0-4650-b350-4ad517c600a1-kube-api-access-zsrpm\") pod \"certified-operators-smmvx\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.841595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-catalog-content\") pod \"certified-operators-smmvx\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.841631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-utilities\") pod \"certified-operators-smmvx\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.842383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-utilities\") pod \"certified-operators-smmvx\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.842928 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-catalog-content\") pod \"certified-operators-smmvx\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:11 crc kubenswrapper[4707]: I0218 06:23:11.862626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsrpm\" (UniqueName: \"kubernetes.io/projected/48c21772-a9e0-4650-b350-4ad517c600a1-kube-api-access-zsrpm\") pod \"certified-operators-smmvx\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:12 crc kubenswrapper[4707]: I0218 06:23:12.038713 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:12 crc kubenswrapper[4707]: I0218 06:23:12.536659 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-smmvx"] Feb 18 06:23:12 crc kubenswrapper[4707]: I0218 06:23:12.773749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smmvx" event={"ID":"48c21772-a9e0-4650-b350-4ad517c600a1","Type":"ContainerStarted","Data":"757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f"} Feb 18 06:23:12 crc kubenswrapper[4707]: I0218 06:23:12.773819 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smmvx" event={"ID":"48c21772-a9e0-4650-b350-4ad517c600a1","Type":"ContainerStarted","Data":"9f4e80fd0fa3fe314add5425e24ec840fd0e80a5829d231d809773ced8e55f52"} Feb 18 06:23:13 crc kubenswrapper[4707]: I0218 06:23:13.787518 4707 generic.go:334] "Generic (PLEG): container finished" podID="48c21772-a9e0-4650-b350-4ad517c600a1" containerID="757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f" exitCode=0 Feb 18 06:23:13 crc 
kubenswrapper[4707]: I0218 06:23:13.787604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smmvx" event={"ID":"48c21772-a9e0-4650-b350-4ad517c600a1","Type":"ContainerDied","Data":"757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f"} Feb 18 06:23:14 crc kubenswrapper[4707]: I0218 06:23:14.801310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smmvx" event={"ID":"48c21772-a9e0-4650-b350-4ad517c600a1","Type":"ContainerStarted","Data":"c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7"} Feb 18 06:23:15 crc kubenswrapper[4707]: I0218 06:23:15.811763 4707 generic.go:334] "Generic (PLEG): container finished" podID="48c21772-a9e0-4650-b350-4ad517c600a1" containerID="c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7" exitCode=0 Feb 18 06:23:15 crc kubenswrapper[4707]: I0218 06:23:15.811834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smmvx" event={"ID":"48c21772-a9e0-4650-b350-4ad517c600a1","Type":"ContainerDied","Data":"c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7"} Feb 18 06:23:17 crc kubenswrapper[4707]: I0218 06:23:17.833273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smmvx" event={"ID":"48c21772-a9e0-4650-b350-4ad517c600a1","Type":"ContainerStarted","Data":"48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712"} Feb 18 06:23:17 crc kubenswrapper[4707]: I0218 06:23:17.857879 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-smmvx" podStartSLOduration=3.6597257819999998 podStartE2EDuration="6.857860278s" podCreationTimestamp="2026-02-18 06:23:11 +0000 UTC" firstStartedPulling="2026-02-18 06:23:13.789736432 +0000 UTC m=+2130.437695566" lastFinishedPulling="2026-02-18 06:23:16.987870928 +0000 UTC 
m=+2133.635830062" observedRunningTime="2026-02-18 06:23:17.854422575 +0000 UTC m=+2134.502381739" watchObservedRunningTime="2026-02-18 06:23:17.857860278 +0000 UTC m=+2134.505819422" Feb 18 06:23:21 crc kubenswrapper[4707]: I0218 06:23:21.382070 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:23:21 crc kubenswrapper[4707]: I0218 06:23:21.382401 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:23:22 crc kubenswrapper[4707]: I0218 06:23:22.039019 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:22 crc kubenswrapper[4707]: I0218 06:23:22.040158 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:22 crc kubenswrapper[4707]: I0218 06:23:22.094594 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:22 crc kubenswrapper[4707]: I0218 06:23:22.958327 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:23 crc kubenswrapper[4707]: I0218 06:23:23.669222 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-smmvx"] Feb 18 06:23:24 crc kubenswrapper[4707]: I0218 06:23:24.913401 4707 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/certified-operators-smmvx" podUID="48c21772-a9e0-4650-b350-4ad517c600a1" containerName="registry-server" containerID="cri-o://48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712" gracePeriod=2 Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.391657 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.425384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-utilities\") pod \"48c21772-a9e0-4650-b350-4ad517c600a1\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.425565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-catalog-content\") pod \"48c21772-a9e0-4650-b350-4ad517c600a1\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.425700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsrpm\" (UniqueName: \"kubernetes.io/projected/48c21772-a9e0-4650-b350-4ad517c600a1-kube-api-access-zsrpm\") pod \"48c21772-a9e0-4650-b350-4ad517c600a1\" (UID: \"48c21772-a9e0-4650-b350-4ad517c600a1\") " Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.426426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-utilities" (OuterVolumeSpecName: "utilities") pod "48c21772-a9e0-4650-b350-4ad517c600a1" (UID: "48c21772-a9e0-4650-b350-4ad517c600a1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.430698 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c21772-a9e0-4650-b350-4ad517c600a1-kube-api-access-zsrpm" (OuterVolumeSpecName: "kube-api-access-zsrpm") pod "48c21772-a9e0-4650-b350-4ad517c600a1" (UID: "48c21772-a9e0-4650-b350-4ad517c600a1"). InnerVolumeSpecName "kube-api-access-zsrpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.528348 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsrpm\" (UniqueName: \"kubernetes.io/projected/48c21772-a9e0-4650-b350-4ad517c600a1-kube-api-access-zsrpm\") on node \"crc\" DevicePath \"\"" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.528387 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.886707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48c21772-a9e0-4650-b350-4ad517c600a1" (UID: "48c21772-a9e0-4650-b350-4ad517c600a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.925005 4707 generic.go:334] "Generic (PLEG): container finished" podID="48c21772-a9e0-4650-b350-4ad517c600a1" containerID="48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712" exitCode=0 Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.925100 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-smmvx" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.926072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smmvx" event={"ID":"48c21772-a9e0-4650-b350-4ad517c600a1","Type":"ContainerDied","Data":"48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712"} Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.926235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smmvx" event={"ID":"48c21772-a9e0-4650-b350-4ad517c600a1","Type":"ContainerDied","Data":"9f4e80fd0fa3fe314add5425e24ec840fd0e80a5829d231d809773ced8e55f52"} Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.926291 4707 scope.go:117] "RemoveContainer" containerID="48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.936497 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c21772-a9e0-4650-b350-4ad517c600a1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.951910 4707 scope.go:117] "RemoveContainer" containerID="c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.964242 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-smmvx"] Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.972431 4707 scope.go:117] "RemoveContainer" containerID="757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f" Feb 18 06:23:25 crc kubenswrapper[4707]: I0218 06:23:25.974881 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-smmvx"] Feb 18 06:23:26 crc kubenswrapper[4707]: I0218 06:23:26.038975 4707 scope.go:117] "RemoveContainer" 
containerID="48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712" Feb 18 06:23:26 crc kubenswrapper[4707]: E0218 06:23:26.039491 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712\": container with ID starting with 48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712 not found: ID does not exist" containerID="48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712" Feb 18 06:23:26 crc kubenswrapper[4707]: I0218 06:23:26.039529 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712"} err="failed to get container status \"48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712\": rpc error: code = NotFound desc = could not find container \"48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712\": container with ID starting with 48ffdcc85e89f582d5a39edd4d8193f7ae63f24effa7a5a40348ea703858c712 not found: ID does not exist" Feb 18 06:23:26 crc kubenswrapper[4707]: I0218 06:23:26.039551 4707 scope.go:117] "RemoveContainer" containerID="c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7" Feb 18 06:23:26 crc kubenswrapper[4707]: E0218 06:23:26.039937 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7\": container with ID starting with c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7 not found: ID does not exist" containerID="c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7" Feb 18 06:23:26 crc kubenswrapper[4707]: I0218 06:23:26.040097 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7"} err="failed to get container status \"c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7\": rpc error: code = NotFound desc = could not find container \"c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7\": container with ID starting with c41ff7747b9260f4fecfeb18539eb8934a456167a34631f6e03c99fbbc4e6fe7 not found: ID does not exist" Feb 18 06:23:26 crc kubenswrapper[4707]: I0218 06:23:26.040219 4707 scope.go:117] "RemoveContainer" containerID="757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f" Feb 18 06:23:26 crc kubenswrapper[4707]: E0218 06:23:26.040529 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f\": container with ID starting with 757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f not found: ID does not exist" containerID="757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f" Feb 18 06:23:26 crc kubenswrapper[4707]: I0218 06:23:26.040554 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f"} err="failed to get container status \"757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f\": rpc error: code = NotFound desc = could not find container \"757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f\": container with ID starting with 757c1a71c357db8c6dbd722446809b56afca74785a99e160f401b5f424cc840f not found: ID does not exist" Feb 18 06:23:26 crc kubenswrapper[4707]: I0218 06:23:26.065638 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c21772-a9e0-4650-b350-4ad517c600a1" path="/var/lib/kubelet/pods/48c21772-a9e0-4650-b350-4ad517c600a1/volumes" Feb 18 06:23:51 crc kubenswrapper[4707]: I0218 
06:23:51.381893 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:23:51 crc kubenswrapper[4707]: I0218 06:23:51.383421 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:23:51 crc kubenswrapper[4707]: I0218 06:23:51.383540 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:23:51 crc kubenswrapper[4707]: I0218 06:23:51.384161 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:23:51 crc kubenswrapper[4707]: I0218 06:23:51.384325 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" gracePeriod=600 Feb 18 06:23:51 crc kubenswrapper[4707]: E0218 06:23:51.560210 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:23:52 crc kubenswrapper[4707]: I0218 06:23:52.211327 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" exitCode=0 Feb 18 06:23:52 crc kubenswrapper[4707]: I0218 06:23:52.211374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820"} Feb 18 06:23:52 crc kubenswrapper[4707]: I0218 06:23:52.211407 4707 scope.go:117] "RemoveContainer" containerID="b793e97ac0e534321a4fdc530604143a86bad11d81b78baa1a5c35dfbdc0cbf8" Feb 18 06:23:52 crc kubenswrapper[4707]: I0218 06:23:52.212228 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:23:52 crc kubenswrapper[4707]: E0218 06:23:52.214011 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:24:03 crc kubenswrapper[4707]: I0218 06:24:03.053184 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:24:03 crc kubenswrapper[4707]: E0218 06:24:03.053983 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:24:18 crc kubenswrapper[4707]: I0218 06:24:18.053865 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:24:18 crc kubenswrapper[4707]: E0218 06:24:18.054626 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:24:33 crc kubenswrapper[4707]: I0218 06:24:33.053558 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:24:33 crc kubenswrapper[4707]: E0218 06:24:33.054341 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:24:45 crc kubenswrapper[4707]: I0218 06:24:45.053460 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:24:45 crc kubenswrapper[4707]: E0218 06:24:45.054340 4707 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:24:56 crc kubenswrapper[4707]: I0218 06:24:56.053571 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:24:56 crc kubenswrapper[4707]: E0218 06:24:56.054743 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:25:07 crc kubenswrapper[4707]: I0218 06:25:07.053427 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:25:07 crc kubenswrapper[4707]: E0218 06:25:07.054057 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:25:19 crc kubenswrapper[4707]: I0218 06:25:19.053116 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:25:19 crc kubenswrapper[4707]: E0218 06:25:19.055124 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:25:32 crc kubenswrapper[4707]: I0218 06:25:32.053827 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:25:32 crc kubenswrapper[4707]: E0218 06:25:32.054672 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:25:45 crc kubenswrapper[4707]: I0218 06:25:45.053432 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:25:45 crc kubenswrapper[4707]: E0218 06:25:45.054175 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:25:58 crc kubenswrapper[4707]: I0218 06:25:58.052576 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:25:58 crc kubenswrapper[4707]: E0218 
06:25:58.053431 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:26:13 crc kubenswrapper[4707]: I0218 06:26:13.053040 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:26:13 crc kubenswrapper[4707]: E0218 06:26:13.053826 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:26:28 crc kubenswrapper[4707]: I0218 06:26:28.053324 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:26:28 crc kubenswrapper[4707]: E0218 06:26:28.054138 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:26:28 crc kubenswrapper[4707]: I0218 06:26:28.700045 4707 generic.go:334] "Generic (PLEG): container finished" podID="603500de-24c1-4ef6-a13a-24646a085b58" 
containerID="5a322aaae3f5053ebfe37df0b7dd92e8a6fceecce06cbf70ba2922d36e68bc76" exitCode=0 Feb 18 06:26:28 crc kubenswrapper[4707]: I0218 06:26:28.700257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" event={"ID":"603500de-24c1-4ef6-a13a-24646a085b58","Type":"ContainerDied","Data":"5a322aaae3f5053ebfe37df0b7dd92e8a6fceecce06cbf70ba2922d36e68bc76"} Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.161573 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.311759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-combined-ca-bundle\") pod \"603500de-24c1-4ef6-a13a-24646a085b58\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.311897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmqsb\" (UniqueName: \"kubernetes.io/projected/603500de-24c1-4ef6-a13a-24646a085b58-kube-api-access-xmqsb\") pod \"603500de-24c1-4ef6-a13a-24646a085b58\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.311933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-secret-0\") pod \"603500de-24c1-4ef6-a13a-24646a085b58\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.311967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-ssh-key-openstack-edpm-ipam\") pod \"603500de-24c1-4ef6-a13a-24646a085b58\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.311987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-inventory\") pod \"603500de-24c1-4ef6-a13a-24646a085b58\" (UID: \"603500de-24c1-4ef6-a13a-24646a085b58\") " Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.319342 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "603500de-24c1-4ef6-a13a-24646a085b58" (UID: "603500de-24c1-4ef6-a13a-24646a085b58"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.329051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603500de-24c1-4ef6-a13a-24646a085b58-kube-api-access-xmqsb" (OuterVolumeSpecName: "kube-api-access-xmqsb") pod "603500de-24c1-4ef6-a13a-24646a085b58" (UID: "603500de-24c1-4ef6-a13a-24646a085b58"). InnerVolumeSpecName "kube-api-access-xmqsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.340593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "603500de-24c1-4ef6-a13a-24646a085b58" (UID: "603500de-24c1-4ef6-a13a-24646a085b58"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.343120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-inventory" (OuterVolumeSpecName: "inventory") pod "603500de-24c1-4ef6-a13a-24646a085b58" (UID: "603500de-24c1-4ef6-a13a-24646a085b58"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.346239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "603500de-24c1-4ef6-a13a-24646a085b58" (UID: "603500de-24c1-4ef6-a13a-24646a085b58"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.414868 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.414902 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmqsb\" (UniqueName: \"kubernetes.io/projected/603500de-24c1-4ef6-a13a-24646a085b58-kube-api-access-xmqsb\") on node \"crc\" DevicePath \"\"" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.414912 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.414921 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.414931 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603500de-24c1-4ef6-a13a-24646a085b58-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.737552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" event={"ID":"603500de-24c1-4ef6-a13a-24646a085b58","Type":"ContainerDied","Data":"e835037fcc5fe3a7112858b46d3bc0a72441e12e530e73adcbbe39d3f89c146d"} Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.737591 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e835037fcc5fe3a7112858b46d3bc0a72441e12e530e73adcbbe39d3f89c146d" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.737645 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.806514 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h"] Feb 18 06:26:30 crc kubenswrapper[4707]: E0218 06:26:30.806983 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603500de-24c1-4ef6-a13a-24646a085b58" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.807006 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="603500de-24c1-4ef6-a13a-24646a085b58" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 06:26:30 crc kubenswrapper[4707]: E0218 06:26:30.807032 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c21772-a9e0-4650-b350-4ad517c600a1" containerName="extract-utilities" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.807039 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c21772-a9e0-4650-b350-4ad517c600a1" containerName="extract-utilities" Feb 18 06:26:30 crc kubenswrapper[4707]: E0218 06:26:30.807075 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c21772-a9e0-4650-b350-4ad517c600a1" containerName="registry-server" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.807083 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c21772-a9e0-4650-b350-4ad517c600a1" containerName="registry-server" Feb 18 06:26:30 crc kubenswrapper[4707]: E0218 06:26:30.807095 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c21772-a9e0-4650-b350-4ad517c600a1" containerName="extract-content" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.807102 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c21772-a9e0-4650-b350-4ad517c600a1" containerName="extract-content" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.807295 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="603500de-24c1-4ef6-a13a-24646a085b58" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.807323 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c21772-a9e0-4650-b350-4ad517c600a1" containerName="registry-server" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.808317 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.810632 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.811183 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.811253 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.811215 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.811563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.811846 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.812987 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.815999 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h"] Feb 18 06:26:30 crc 
kubenswrapper[4707]: I0218 06:26:30.924617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.924660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.924685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.924841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.924879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.924896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.924952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.924977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.925010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.925062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmdgl\" (UniqueName: \"kubernetes.io/projected/4f8eff2f-2ca1-4fe4-8138-333c62468b97-kube-api-access-kmdgl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:30 crc kubenswrapper[4707]: I0218 06:26:30.925081 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc 
kubenswrapper[4707]: I0218 06:26:31.027353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmdgl\" (UniqueName: \"kubernetes.io/projected/4f8eff2f-2ca1-4fe4-8138-333c62468b97-kube-api-access-kmdgl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.027649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.028620 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.031629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.032210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.032503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.032566 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.032682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.033282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.033901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.034172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc 
kubenswrapper[4707]: I0218 06:26:31.034651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.042672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmdgl\" (UniqueName: \"kubernetes.io/projected/4f8eff2f-2ca1-4fe4-8138-333c62468b97-kube-api-access-kmdgl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dck9h\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.128882 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.638702 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h"] Feb 18 06:26:31 crc kubenswrapper[4707]: I0218 06:26:31.747788 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" event={"ID":"4f8eff2f-2ca1-4fe4-8138-333c62468b97","Type":"ContainerStarted","Data":"44afcb6a3a245d465783fbcf18225b58937c35a23373e53703002d532b1af81c"} Feb 18 06:26:32 crc kubenswrapper[4707]: I0218 06:26:32.756736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" event={"ID":"4f8eff2f-2ca1-4fe4-8138-333c62468b97","Type":"ContainerStarted","Data":"95c76fd38548601a2c03567d78e6ba6469df1469c180c11a623b6e8d85008773"} Feb 18 06:26:41 crc kubenswrapper[4707]: I0218 06:26:41.052985 4707 scope.go:117] 
"RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:26:41 crc kubenswrapper[4707]: E0218 06:26:41.053774 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:26:54 crc kubenswrapper[4707]: I0218 06:26:54.065387 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:26:54 crc kubenswrapper[4707]: E0218 06:26:54.066545 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:27:07 crc kubenswrapper[4707]: I0218 06:27:07.053059 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:27:07 crc kubenswrapper[4707]: E0218 06:27:07.053837 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:27:18 crc kubenswrapper[4707]: I0218 06:27:18.053434 
4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:27:18 crc kubenswrapper[4707]: E0218 06:27:18.054676 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:27:33 crc kubenswrapper[4707]: I0218 06:27:33.052641 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:27:33 crc kubenswrapper[4707]: E0218 06:27:33.053454 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:27:45 crc kubenswrapper[4707]: I0218 06:27:45.053788 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:27:45 crc kubenswrapper[4707]: E0218 06:27:45.054766 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:27:57 crc kubenswrapper[4707]: I0218 
06:27:57.053351 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:27:57 crc kubenswrapper[4707]: E0218 06:27:57.054124 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:28:08 crc kubenswrapper[4707]: I0218 06:28:08.053881 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:28:08 crc kubenswrapper[4707]: E0218 06:28:08.054973 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:28:22 crc kubenswrapper[4707]: I0218 06:28:22.053569 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:28:22 crc kubenswrapper[4707]: E0218 06:28:22.054432 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:28:34 crc 
kubenswrapper[4707]: I0218 06:28:34.066981 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:28:34 crc kubenswrapper[4707]: E0218 06:28:34.067926 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:28:49 crc kubenswrapper[4707]: I0218 06:28:49.053569 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:28:49 crc kubenswrapper[4707]: E0218 06:28:49.054673 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:29:02 crc kubenswrapper[4707]: I0218 06:29:02.054182 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:29:03 crc kubenswrapper[4707]: I0218 06:29:03.133544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"09181a591e1c61bbab693cbc930f34bf773db58270cde38cdc2b526bd9f87b1a"} Feb 18 06:29:03 crc kubenswrapper[4707]: I0218 06:29:03.162094 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" podStartSLOduration=152.711369495 podStartE2EDuration="2m33.162049336s" podCreationTimestamp="2026-02-18 06:26:30 +0000 UTC" firstStartedPulling="2026-02-18 06:26:31.63974508 +0000 UTC m=+2328.287704214" lastFinishedPulling="2026-02-18 06:26:32.090424921 +0000 UTC m=+2328.738384055" observedRunningTime="2026-02-18 06:26:32.779330946 +0000 UTC m=+2329.427290080" watchObservedRunningTime="2026-02-18 06:29:03.162049336 +0000 UTC m=+2479.810008470" Feb 18 06:29:06 crc kubenswrapper[4707]: I0218 06:29:06.175380 4707 generic.go:334] "Generic (PLEG): container finished" podID="4f8eff2f-2ca1-4fe4-8138-333c62468b97" containerID="95c76fd38548601a2c03567d78e6ba6469df1469c180c11a623b6e8d85008773" exitCode=0 Feb 18 06:29:06 crc kubenswrapper[4707]: I0218 06:29:06.175452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" event={"ID":"4f8eff2f-2ca1-4fe4-8138-333c62468b97","Type":"ContainerDied","Data":"95c76fd38548601a2c03567d78e6ba6469df1469c180c11a623b6e8d85008773"} Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.678454 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-ssh-key-openstack-edpm-ipam\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-extra-config-0\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774337 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-combined-ca-bundle\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-2\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-1\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 
06:29:07.774432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-0\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774456 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmdgl\" (UniqueName: \"kubernetes.io/projected/4f8eff2f-2ca1-4fe4-8138-333c62468b97-kube-api-access-kmdgl\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-0\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774547 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-inventory\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-3\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.774667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-1\") pod \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\" (UID: \"4f8eff2f-2ca1-4fe4-8138-333c62468b97\") " Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.792334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8eff2f-2ca1-4fe4-8138-333c62468b97-kube-api-access-kmdgl" (OuterVolumeSpecName: "kube-api-access-kmdgl") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "kube-api-access-kmdgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.800325 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.805993 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.806771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.808820 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.812884 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-inventory" (OuterVolumeSpecName: "inventory") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.813331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.819588 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.820708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.836881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.838440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "4f8eff2f-2ca1-4fe4-8138-333c62468b97" (UID: "4f8eff2f-2ca1-4fe4-8138-333c62468b97"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877410 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877449 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877462 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877477 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877489 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877502 4707 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877514 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877525 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877538 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877549 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4f8eff2f-2ca1-4fe4-8138-333c62468b97-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:07 crc kubenswrapper[4707]: I0218 06:29:07.877564 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmdgl\" (UniqueName: \"kubernetes.io/projected/4f8eff2f-2ca1-4fe4-8138-333c62468b97-kube-api-access-kmdgl\") on node \"crc\" DevicePath \"\"" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.192003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" event={"ID":"4f8eff2f-2ca1-4fe4-8138-333c62468b97","Type":"ContainerDied","Data":"44afcb6a3a245d465783fbcf18225b58937c35a23373e53703002d532b1af81c"} Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.192036 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dck9h" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.192043 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44afcb6a3a245d465783fbcf18225b58937c35a23373e53703002d532b1af81c" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.296021 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l"] Feb 18 06:29:08 crc kubenswrapper[4707]: E0218 06:29:08.296394 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8eff2f-2ca1-4fe4-8138-333c62468b97" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.296411 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8eff2f-2ca1-4fe4-8138-333c62468b97" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.296603 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8eff2f-2ca1-4fe4-8138-333c62468b97" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.297348 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.299945 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.300061 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b7sd6" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.300216 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.302030 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.304493 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.314974 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l"] Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.391559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.391606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: 
\"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.391628 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.391669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.391708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2bsv\" (UniqueName: \"kubernetes.io/projected/6afe228a-638b-41a3-ba74-556fbc740148-kube-api-access-d2bsv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.391738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.391832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.493771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.493899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.493921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.493939 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.493980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.494017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2bsv\" (UniqueName: \"kubernetes.io/projected/6afe228a-638b-41a3-ba74-556fbc740148-kube-api-access-d2bsv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.494044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.502671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.502870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.502913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.502967 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.503276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: 
\"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.512379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.512834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2bsv\" (UniqueName: \"kubernetes.io/projected/6afe228a-638b-41a3-ba74-556fbc740148-kube-api-access-d2bsv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:08 crc kubenswrapper[4707]: I0218 06:29:08.612029 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:29:09 crc kubenswrapper[4707]: I0218 06:29:09.139269 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l"] Feb 18 06:29:09 crc kubenswrapper[4707]: W0218 06:29:09.140152 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6afe228a_638b_41a3_ba74_556fbc740148.slice/crio-3cd37f662c5ccaa9f2d12f4734d6b5a661fbe5f8ee7c963e3d935b0fafd393c6 WatchSource:0}: Error finding container 3cd37f662c5ccaa9f2d12f4734d6b5a661fbe5f8ee7c963e3d935b0fafd393c6: Status 404 returned error can't find the container with id 3cd37f662c5ccaa9f2d12f4734d6b5a661fbe5f8ee7c963e3d935b0fafd393c6 Feb 18 06:29:09 crc kubenswrapper[4707]: I0218 06:29:09.143027 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:29:09 crc kubenswrapper[4707]: I0218 06:29:09.200187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" event={"ID":"6afe228a-638b-41a3-ba74-556fbc740148","Type":"ContainerStarted","Data":"3cd37f662c5ccaa9f2d12f4734d6b5a661fbe5f8ee7c963e3d935b0fafd393c6"} Feb 18 06:29:10 crc kubenswrapper[4707]: I0218 06:29:10.230897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" event={"ID":"6afe228a-638b-41a3-ba74-556fbc740148","Type":"ContainerStarted","Data":"94f75efdf1eafcce91d4ed7ada04f8d1e65436de7e55ff7d3d641293d6277868"} Feb 18 06:29:10 crc kubenswrapper[4707]: I0218 06:29:10.252082 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" podStartSLOduration=1.758019545 podStartE2EDuration="2.252064329s" podCreationTimestamp="2026-02-18 06:29:08 +0000 UTC" 
firstStartedPulling="2026-02-18 06:29:09.142846605 +0000 UTC m=+2485.790805739" lastFinishedPulling="2026-02-18 06:29:09.636891389 +0000 UTC m=+2486.284850523" observedRunningTime="2026-02-18 06:29:10.249633413 +0000 UTC m=+2486.897592547" watchObservedRunningTime="2026-02-18 06:29:10.252064329 +0000 UTC m=+2486.900023463" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.151838 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97"] Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.153783 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.157576 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.158647 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.161999 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97"] Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.213245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e25207ca-54be-481f-a992-f5de337090f9-secret-volume\") pod \"collect-profiles-29523270-fnc97\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.213303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hdxj\" (UniqueName: 
\"kubernetes.io/projected/e25207ca-54be-481f-a992-f5de337090f9-kube-api-access-6hdxj\") pod \"collect-profiles-29523270-fnc97\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.213485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e25207ca-54be-481f-a992-f5de337090f9-config-volume\") pod \"collect-profiles-29523270-fnc97\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.314621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e25207ca-54be-481f-a992-f5de337090f9-config-volume\") pod \"collect-profiles-29523270-fnc97\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.314715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e25207ca-54be-481f-a992-f5de337090f9-secret-volume\") pod \"collect-profiles-29523270-fnc97\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.314752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hdxj\" (UniqueName: \"kubernetes.io/projected/e25207ca-54be-481f-a992-f5de337090f9-kube-api-access-6hdxj\") pod \"collect-profiles-29523270-fnc97\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc 
kubenswrapper[4707]: I0218 06:30:00.315631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e25207ca-54be-481f-a992-f5de337090f9-config-volume\") pod \"collect-profiles-29523270-fnc97\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.320421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e25207ca-54be-481f-a992-f5de337090f9-secret-volume\") pod \"collect-profiles-29523270-fnc97\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.332665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hdxj\" (UniqueName: \"kubernetes.io/projected/e25207ca-54be-481f-a992-f5de337090f9-kube-api-access-6hdxj\") pod \"collect-profiles-29523270-fnc97\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.478098 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:00 crc kubenswrapper[4707]: I0218 06:30:00.910970 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97"] Feb 18 06:30:02 crc kubenswrapper[4707]: I0218 06:30:02.219215 4707 generic.go:334] "Generic (PLEG): container finished" podID="e25207ca-54be-481f-a992-f5de337090f9" containerID="e8de6d7eff3647717012ff73d8fd074e59d48307892d8b51655a73f09bf12963" exitCode=0 Feb 18 06:30:02 crc kubenswrapper[4707]: I0218 06:30:02.244925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" event={"ID":"e25207ca-54be-481f-a992-f5de337090f9","Type":"ContainerDied","Data":"e8de6d7eff3647717012ff73d8fd074e59d48307892d8b51655a73f09bf12963"} Feb 18 06:30:02 crc kubenswrapper[4707]: I0218 06:30:02.244966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" event={"ID":"e25207ca-54be-481f-a992-f5de337090f9","Type":"ContainerStarted","Data":"45523d108ff080a8fb06824024cae95e096658dbed23696f85125e1d73cccebb"} Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.582693 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.733547 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e25207ca-54be-481f-a992-f5de337090f9-config-volume\") pod \"e25207ca-54be-481f-a992-f5de337090f9\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.733661 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hdxj\" (UniqueName: \"kubernetes.io/projected/e25207ca-54be-481f-a992-f5de337090f9-kube-api-access-6hdxj\") pod \"e25207ca-54be-481f-a992-f5de337090f9\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.733876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e25207ca-54be-481f-a992-f5de337090f9-secret-volume\") pod \"e25207ca-54be-481f-a992-f5de337090f9\" (UID: \"e25207ca-54be-481f-a992-f5de337090f9\") " Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.734731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25207ca-54be-481f-a992-f5de337090f9-config-volume" (OuterVolumeSpecName: "config-volume") pod "e25207ca-54be-481f-a992-f5de337090f9" (UID: "e25207ca-54be-481f-a992-f5de337090f9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.745074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25207ca-54be-481f-a992-f5de337090f9-kube-api-access-6hdxj" (OuterVolumeSpecName: "kube-api-access-6hdxj") pod "e25207ca-54be-481f-a992-f5de337090f9" (UID: "e25207ca-54be-481f-a992-f5de337090f9"). 
InnerVolumeSpecName "kube-api-access-6hdxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.767135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e25207ca-54be-481f-a992-f5de337090f9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e25207ca-54be-481f-a992-f5de337090f9" (UID: "e25207ca-54be-481f-a992-f5de337090f9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.836066 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e25207ca-54be-481f-a992-f5de337090f9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.836128 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hdxj\" (UniqueName: \"kubernetes.io/projected/e25207ca-54be-481f-a992-f5de337090f9-kube-api-access-6hdxj\") on node \"crc\" DevicePath \"\"" Feb 18 06:30:03 crc kubenswrapper[4707]: I0218 06:30:03.836143 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e25207ca-54be-481f-a992-f5de337090f9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:30:04 crc kubenswrapper[4707]: I0218 06:30:04.236569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" event={"ID":"e25207ca-54be-481f-a992-f5de337090f9","Type":"ContainerDied","Data":"45523d108ff080a8fb06824024cae95e096658dbed23696f85125e1d73cccebb"} Feb 18 06:30:04 crc kubenswrapper[4707]: I0218 06:30:04.236614 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97" Feb 18 06:30:04 crc kubenswrapper[4707]: I0218 06:30:04.236620 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45523d108ff080a8fb06824024cae95e096658dbed23696f85125e1d73cccebb" Feb 18 06:30:04 crc kubenswrapper[4707]: I0218 06:30:04.656030 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl"] Feb 18 06:30:04 crc kubenswrapper[4707]: I0218 06:30:04.664276 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523225-rdxbl"] Feb 18 06:30:06 crc kubenswrapper[4707]: I0218 06:30:06.065150 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb9f9f8-7162-479f-a789-dd3e61578ec4" path="/var/lib/kubelet/pods/fdb9f9f8-7162-479f-a789-dd3e61578ec4/volumes" Feb 18 06:30:49 crc kubenswrapper[4707]: I0218 06:30:49.825026 4707 scope.go:117] "RemoveContainer" containerID="68b3e2553bd0d758fbbe9e1154663118b0672dd6676e9b1c9eec82512a86a360" Feb 18 06:31:21 crc kubenswrapper[4707]: I0218 06:31:21.382244 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:31:21 crc kubenswrapper[4707]: I0218 06:31:21.382888 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:31:51 crc kubenswrapper[4707]: I0218 06:31:51.382527 4707 patch_prober.go:28] interesting 
pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:31:51 crc kubenswrapper[4707]: I0218 06:31:51.383340 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:32:06 crc kubenswrapper[4707]: I0218 06:32:06.327475 4707 generic.go:334] "Generic (PLEG): container finished" podID="6afe228a-638b-41a3-ba74-556fbc740148" containerID="94f75efdf1eafcce91d4ed7ada04f8d1e65436de7e55ff7d3d641293d6277868" exitCode=0 Feb 18 06:32:06 crc kubenswrapper[4707]: I0218 06:32:06.327572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" event={"ID":"6afe228a-638b-41a3-ba74-556fbc740148","Type":"ContainerDied","Data":"94f75efdf1eafcce91d4ed7ada04f8d1e65436de7e55ff7d3d641293d6277868"} Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.804092 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.948242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-inventory\") pod \"6afe228a-638b-41a3-ba74-556fbc740148\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.948304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-2\") pod \"6afe228a-638b-41a3-ba74-556fbc740148\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.948443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-0\") pod \"6afe228a-638b-41a3-ba74-556fbc740148\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.948464 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-1\") pod \"6afe228a-638b-41a3-ba74-556fbc740148\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.948566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ssh-key-openstack-edpm-ipam\") pod \"6afe228a-638b-41a3-ba74-556fbc740148\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " Feb 18 06:32:07 crc 
kubenswrapper[4707]: I0218 06:32:07.948591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-telemetry-combined-ca-bundle\") pod \"6afe228a-638b-41a3-ba74-556fbc740148\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.948631 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2bsv\" (UniqueName: \"kubernetes.io/projected/6afe228a-638b-41a3-ba74-556fbc740148-kube-api-access-d2bsv\") pod \"6afe228a-638b-41a3-ba74-556fbc740148\" (UID: \"6afe228a-638b-41a3-ba74-556fbc740148\") " Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.953872 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6afe228a-638b-41a3-ba74-556fbc740148" (UID: "6afe228a-638b-41a3-ba74-556fbc740148"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.954346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6afe228a-638b-41a3-ba74-556fbc740148-kube-api-access-d2bsv" (OuterVolumeSpecName: "kube-api-access-d2bsv") pod "6afe228a-638b-41a3-ba74-556fbc740148" (UID: "6afe228a-638b-41a3-ba74-556fbc740148"). InnerVolumeSpecName "kube-api-access-d2bsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.982141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "6afe228a-638b-41a3-ba74-556fbc740148" (UID: "6afe228a-638b-41a3-ba74-556fbc740148"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.983968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "6afe228a-638b-41a3-ba74-556fbc740148" (UID: "6afe228a-638b-41a3-ba74-556fbc740148"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.985974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-inventory" (OuterVolumeSpecName: "inventory") pod "6afe228a-638b-41a3-ba74-556fbc740148" (UID: "6afe228a-638b-41a3-ba74-556fbc740148"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.988107 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "6afe228a-638b-41a3-ba74-556fbc740148" (UID: "6afe228a-638b-41a3-ba74-556fbc740148"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:32:07 crc kubenswrapper[4707]: I0218 06:32:07.990389 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6afe228a-638b-41a3-ba74-556fbc740148" (UID: "6afe228a-638b-41a3-ba74-556fbc740148"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 06:32:08.050945 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 06:32:08.050981 4707 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 06:32:08.050994 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2bsv\" (UniqueName: \"kubernetes.io/projected/6afe228a-638b-41a3-ba74-556fbc740148-kube-api-access-d2bsv\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 06:32:08.051006 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 06:32:08.051019 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 
06:32:08.051029 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 06:32:08.051041 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6afe228a-638b-41a3-ba74-556fbc740148-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 06:32:08.354554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" event={"ID":"6afe228a-638b-41a3-ba74-556fbc740148","Type":"ContainerDied","Data":"3cd37f662c5ccaa9f2d12f4734d6b5a661fbe5f8ee7c963e3d935b0fafd393c6"} Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 06:32:08.355181 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cd37f662c5ccaa9f2d12f4734d6b5a661fbe5f8ee7c963e3d935b0fafd393c6" Feb 18 06:32:08 crc kubenswrapper[4707]: I0218 06:32:08.354614 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l" Feb 18 06:32:21 crc kubenswrapper[4707]: I0218 06:32:21.382278 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:32:21 crc kubenswrapper[4707]: I0218 06:32:21.382903 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:32:21 crc kubenswrapper[4707]: I0218 06:32:21.382956 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:32:21 crc kubenswrapper[4707]: I0218 06:32:21.383859 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09181a591e1c61bbab693cbc930f34bf773db58270cde38cdc2b526bd9f87b1a"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:32:21 crc kubenswrapper[4707]: I0218 06:32:21.383929 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://09181a591e1c61bbab693cbc930f34bf773db58270cde38cdc2b526bd9f87b1a" gracePeriod=600 Feb 18 06:32:22 crc kubenswrapper[4707]: I0218 06:32:22.500961 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="09181a591e1c61bbab693cbc930f34bf773db58270cde38cdc2b526bd9f87b1a" exitCode=0 Feb 18 06:32:22 crc kubenswrapper[4707]: I0218 06:32:22.501009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"09181a591e1c61bbab693cbc930f34bf773db58270cde38cdc2b526bd9f87b1a"} Feb 18 06:32:22 crc kubenswrapper[4707]: I0218 06:32:22.501491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc"} Feb 18 06:32:22 crc kubenswrapper[4707]: I0218 06:32:22.501517 4707 scope.go:117] "RemoveContainer" containerID="22c1ff44b3fc3903a44450c045d1493cd7de24aa9284599b82b0e9d6423b6820" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.642750 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kvl7t"] Feb 18 06:32:23 crc kubenswrapper[4707]: E0218 06:32:23.643536 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25207ca-54be-481f-a992-f5de337090f9" containerName="collect-profiles" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.643551 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25207ca-54be-481f-a992-f5de337090f9" containerName="collect-profiles" Feb 18 06:32:23 crc kubenswrapper[4707]: E0218 06:32:23.643571 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afe228a-638b-41a3-ba74-556fbc740148" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.643579 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afe228a-638b-41a3-ba74-556fbc740148" 
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.643783 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afe228a-638b-41a3-ba74-556fbc740148" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.643823 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e25207ca-54be-481f-a992-f5de337090f9" containerName="collect-profiles" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.645386 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.665892 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvl7t"] Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.777005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6ggk\" (UniqueName: \"kubernetes.io/projected/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-kube-api-access-v6ggk\") pod \"redhat-marketplace-kvl7t\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.777432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-utilities\") pod \"redhat-marketplace-kvl7t\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.777462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-catalog-content\") pod 
\"redhat-marketplace-kvl7t\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.879527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-utilities\") pod \"redhat-marketplace-kvl7t\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.879608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-catalog-content\") pod \"redhat-marketplace-kvl7t\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.879753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6ggk\" (UniqueName: \"kubernetes.io/projected/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-kube-api-access-v6ggk\") pod \"redhat-marketplace-kvl7t\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.880228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-utilities\") pod \"redhat-marketplace-kvl7t\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.880387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-catalog-content\") pod \"redhat-marketplace-kvl7t\" (UID: 
\"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.906597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6ggk\" (UniqueName: \"kubernetes.io/projected/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-kube-api-access-v6ggk\") pod \"redhat-marketplace-kvl7t\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:23 crc kubenswrapper[4707]: I0218 06:32:23.968126 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:24 crc kubenswrapper[4707]: I0218 06:32:24.420116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvl7t"] Feb 18 06:32:24 crc kubenswrapper[4707]: W0218 06:32:24.429084 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb8142f8_1c05_4ca0_a9cc_4033d1017b62.slice/crio-2443dc651438a0e1400d33f151bd886e443f2c5b7654b2a153bb37ad5067141d WatchSource:0}: Error finding container 2443dc651438a0e1400d33f151bd886e443f2c5b7654b2a153bb37ad5067141d: Status 404 returned error can't find the container with id 2443dc651438a0e1400d33f151bd886e443f2c5b7654b2a153bb37ad5067141d Feb 18 06:32:24 crc kubenswrapper[4707]: I0218 06:32:24.521247 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvl7t" event={"ID":"cb8142f8-1c05-4ca0-a9cc-4033d1017b62","Type":"ContainerStarted","Data":"2443dc651438a0e1400d33f151bd886e443f2c5b7654b2a153bb37ad5067141d"} Feb 18 06:32:25 crc kubenswrapper[4707]: I0218 06:32:25.549633 4707 generic.go:334] "Generic (PLEG): container finished" podID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerID="47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e" exitCode=0 Feb 18 06:32:25 crc 
kubenswrapper[4707]: I0218 06:32:25.550092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvl7t" event={"ID":"cb8142f8-1c05-4ca0-a9cc-4033d1017b62","Type":"ContainerDied","Data":"47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e"} Feb 18 06:32:26 crc kubenswrapper[4707]: I0218 06:32:26.562281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvl7t" event={"ID":"cb8142f8-1c05-4ca0-a9cc-4033d1017b62","Type":"ContainerStarted","Data":"4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833"} Feb 18 06:32:27 crc kubenswrapper[4707]: I0218 06:32:27.573582 4707 generic.go:334] "Generic (PLEG): container finished" podID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerID="4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833" exitCode=0 Feb 18 06:32:27 crc kubenswrapper[4707]: I0218 06:32:27.573634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvl7t" event={"ID":"cb8142f8-1c05-4ca0-a9cc-4033d1017b62","Type":"ContainerDied","Data":"4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833"} Feb 18 06:32:29 crc kubenswrapper[4707]: I0218 06:32:29.590970 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvl7t" event={"ID":"cb8142f8-1c05-4ca0-a9cc-4033d1017b62","Type":"ContainerStarted","Data":"d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c"} Feb 18 06:32:29 crc kubenswrapper[4707]: I0218 06:32:29.617270 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kvl7t" podStartSLOduration=3.729466234 podStartE2EDuration="6.617250338s" podCreationTimestamp="2026-02-18 06:32:23 +0000 UTC" firstStartedPulling="2026-02-18 06:32:25.55174895 +0000 UTC m=+2682.199708084" lastFinishedPulling="2026-02-18 06:32:28.439533044 +0000 UTC m=+2685.087492188" 
observedRunningTime="2026-02-18 06:32:29.609757436 +0000 UTC m=+2686.257716580" watchObservedRunningTime="2026-02-18 06:32:29.617250338 +0000 UTC m=+2686.265209472" Feb 18 06:32:33 crc kubenswrapper[4707]: I0218 06:32:33.969194 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:33 crc kubenswrapper[4707]: I0218 06:32:33.969736 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:34 crc kubenswrapper[4707]: I0218 06:32:34.022699 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:34 crc kubenswrapper[4707]: I0218 06:32:34.687004 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:34 crc kubenswrapper[4707]: I0218 06:32:34.731732 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvl7t"] Feb 18 06:32:36 crc kubenswrapper[4707]: I0218 06:32:36.659691 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kvl7t" podUID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerName="registry-server" containerID="cri-o://d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c" gracePeriod=2 Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.097105 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.202511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6ggk\" (UniqueName: \"kubernetes.io/projected/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-kube-api-access-v6ggk\") pod \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.202551 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-utilities\") pod \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.202573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-catalog-content\") pod \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\" (UID: \"cb8142f8-1c05-4ca0-a9cc-4033d1017b62\") " Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.203435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-utilities" (OuterVolumeSpecName: "utilities") pod "cb8142f8-1c05-4ca0-a9cc-4033d1017b62" (UID: "cb8142f8-1c05-4ca0-a9cc-4033d1017b62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.207233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-kube-api-access-v6ggk" (OuterVolumeSpecName: "kube-api-access-v6ggk") pod "cb8142f8-1c05-4ca0-a9cc-4033d1017b62" (UID: "cb8142f8-1c05-4ca0-a9cc-4033d1017b62"). InnerVolumeSpecName "kube-api-access-v6ggk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.234654 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb8142f8-1c05-4ca0-a9cc-4033d1017b62" (UID: "cb8142f8-1c05-4ca0-a9cc-4033d1017b62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.304966 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6ggk\" (UniqueName: \"kubernetes.io/projected/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-kube-api-access-v6ggk\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.304996 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.305005 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb8142f8-1c05-4ca0-a9cc-4033d1017b62-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.669170 4707 generic.go:334] "Generic (PLEG): container finished" podID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerID="d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c" exitCode=0 Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.669210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvl7t" event={"ID":"cb8142f8-1c05-4ca0-a9cc-4033d1017b62","Type":"ContainerDied","Data":"d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c"} Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.669234 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-kvl7t" event={"ID":"cb8142f8-1c05-4ca0-a9cc-4033d1017b62","Type":"ContainerDied","Data":"2443dc651438a0e1400d33f151bd886e443f2c5b7654b2a153bb37ad5067141d"} Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.669250 4707 scope.go:117] "RemoveContainer" containerID="d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.669246 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kvl7t" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.707417 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvl7t"] Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.712466 4707 scope.go:117] "RemoveContainer" containerID="4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.716152 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvl7t"] Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.728910 4707 scope.go:117] "RemoveContainer" containerID="47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.772659 4707 scope.go:117] "RemoveContainer" containerID="d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c" Feb 18 06:32:37 crc kubenswrapper[4707]: E0218 06:32:37.773331 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c\": container with ID starting with d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c not found: ID does not exist" containerID="d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.773428 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c"} err="failed to get container status \"d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c\": rpc error: code = NotFound desc = could not find container \"d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c\": container with ID starting with d8bf2c86378fc59730a8355a6e857be7fa6717b7978fd108686c0dd0ae4a6b7c not found: ID does not exist" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.773528 4707 scope.go:117] "RemoveContainer" containerID="4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833" Feb 18 06:32:37 crc kubenswrapper[4707]: E0218 06:32:37.773905 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833\": container with ID starting with 4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833 not found: ID does not exist" containerID="4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.773959 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833"} err="failed to get container status \"4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833\": rpc error: code = NotFound desc = could not find container \"4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833\": container with ID starting with 4ef72a0d1d52a72c1d6905a8ba5b60fb57337716775afca014e5d3ea7b490833 not found: ID does not exist" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.773992 4707 scope.go:117] "RemoveContainer" containerID="47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e" Feb 18 06:32:37 crc kubenswrapper[4707]: E0218 
06:32:37.774246 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e\": container with ID starting with 47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e not found: ID does not exist" containerID="47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e" Feb 18 06:32:37 crc kubenswrapper[4707]: I0218 06:32:37.774319 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e"} err="failed to get container status \"47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e\": rpc error: code = NotFound desc = could not find container \"47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e\": container with ID starting with 47bcd515241cfa94679eb73bc57e7e415856ad6377f54b6ee197c9705269dd3e not found: ID does not exist" Feb 18 06:32:38 crc kubenswrapper[4707]: I0218 06:32:38.064292 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" path="/var/lib/kubelet/pods/cb8142f8-1c05-4ca0-a9cc-4033d1017b62/volumes" Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.675396 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tw8cx"] Feb 18 06:32:39 crc kubenswrapper[4707]: E0218 06:32:39.676237 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerName="extract-content" Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.676253 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerName="extract-content" Feb 18 06:32:39 crc kubenswrapper[4707]: E0218 06:32:39.676267 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" 
containerName="registry-server" Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.676278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerName="registry-server" Feb 18 06:32:39 crc kubenswrapper[4707]: E0218 06:32:39.676289 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerName="extract-utilities" Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.676298 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerName="extract-utilities" Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.676582 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb8142f8-1c05-4ca0-a9cc-4033d1017b62" containerName="registry-server" Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.678283 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tw8cx" Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.706172 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tw8cx"] Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.756380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-catalog-content\") pod \"redhat-operators-tw8cx\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") " pod="openshift-marketplace/redhat-operators-tw8cx" Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.756631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-utilities\") pod \"redhat-operators-tw8cx\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") " pod="openshift-marketplace/redhat-operators-tw8cx" Feb 
18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.756701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wplt2\" (UniqueName: \"kubernetes.io/projected/7179bfc2-3556-43fb-9226-d0857434de86-kube-api-access-wplt2\") pod \"redhat-operators-tw8cx\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") " pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.858751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-utilities\") pod \"redhat-operators-tw8cx\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") " pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.858877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wplt2\" (UniqueName: \"kubernetes.io/projected/7179bfc2-3556-43fb-9226-d0857434de86-kube-api-access-wplt2\") pod \"redhat-operators-tw8cx\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") " pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.859003 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-catalog-content\") pod \"redhat-operators-tw8cx\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") " pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.859425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-utilities\") pod \"redhat-operators-tw8cx\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") " pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.859491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-catalog-content\") pod \"redhat-operators-tw8cx\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") " pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:39 crc kubenswrapper[4707]: I0218 06:32:39.888649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wplt2\" (UniqueName: \"kubernetes.io/projected/7179bfc2-3556-43fb-9226-d0857434de86-kube-api-access-wplt2\") pod \"redhat-operators-tw8cx\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") " pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:40 crc kubenswrapper[4707]: I0218 06:32:40.055696 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:40 crc kubenswrapper[4707]: I0218 06:32:40.531982 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tw8cx"]
Feb 18 06:32:40 crc kubenswrapper[4707]: I0218 06:32:40.706110 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw8cx" event={"ID":"7179bfc2-3556-43fb-9226-d0857434de86","Type":"ContainerStarted","Data":"e23143bf2284bff7aaf073f8e24a4d32d9aefd2e52d619c36978a13731916d67"}
Feb 18 06:32:41 crc kubenswrapper[4707]: I0218 06:32:41.718187 4707 generic.go:334] "Generic (PLEG): container finished" podID="7179bfc2-3556-43fb-9226-d0857434de86" containerID="5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668" exitCode=0
Feb 18 06:32:41 crc kubenswrapper[4707]: I0218 06:32:41.718278 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw8cx" event={"ID":"7179bfc2-3556-43fb-9226-d0857434de86","Type":"ContainerDied","Data":"5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668"}
Feb 18 06:32:42 crc kubenswrapper[4707]: I0218 06:32:42.730030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw8cx" event={"ID":"7179bfc2-3556-43fb-9226-d0857434de86","Type":"ContainerStarted","Data":"33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f"}
Feb 18 06:32:43 crc kubenswrapper[4707]: I0218 06:32:43.739958 4707 generic.go:334] "Generic (PLEG): container finished" podID="7179bfc2-3556-43fb-9226-d0857434de86" containerID="33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f" exitCode=0
Feb 18 06:32:43 crc kubenswrapper[4707]: I0218 06:32:43.740325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw8cx" event={"ID":"7179bfc2-3556-43fb-9226-d0857434de86","Type":"ContainerDied","Data":"33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f"}
Feb 18 06:32:44 crc kubenswrapper[4707]: I0218 06:32:44.750695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw8cx" event={"ID":"7179bfc2-3556-43fb-9226-d0857434de86","Type":"ContainerStarted","Data":"61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a"}
Feb 18 06:32:44 crc kubenswrapper[4707]: I0218 06:32:44.770897 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tw8cx" podStartSLOduration=3.298535764 podStartE2EDuration="5.77088017s" podCreationTimestamp="2026-02-18 06:32:39 +0000 UTC" firstStartedPulling="2026-02-18 06:32:41.720445583 +0000 UTC m=+2698.368404727" lastFinishedPulling="2026-02-18 06:32:44.192789999 +0000 UTC m=+2700.840749133" observedRunningTime="2026-02-18 06:32:44.769231846 +0000 UTC m=+2701.417190980" watchObservedRunningTime="2026-02-18 06:32:44.77088017 +0000 UTC m=+2701.418839304"
Feb 18 06:32:50 crc kubenswrapper[4707]: I0218 06:32:50.072164 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:50 crc kubenswrapper[4707]: I0218 06:32:50.072837 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:50 crc kubenswrapper[4707]: I0218 06:32:50.117860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:50 crc kubenswrapper[4707]: I0218 06:32:50.839244 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:50 crc kubenswrapper[4707]: I0218 06:32:50.894890 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tw8cx"]
Feb 18 06:32:52 crc kubenswrapper[4707]: I0218 06:32:52.812251 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tw8cx" podUID="7179bfc2-3556-43fb-9226-d0857434de86" containerName="registry-server" containerID="cri-o://61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a" gracePeriod=2
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.266253 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.340320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-utilities\") pod \"7179bfc2-3556-43fb-9226-d0857434de86\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") "
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.340413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wplt2\" (UniqueName: \"kubernetes.io/projected/7179bfc2-3556-43fb-9226-d0857434de86-kube-api-access-wplt2\") pod \"7179bfc2-3556-43fb-9226-d0857434de86\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") "
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.340620 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-catalog-content\") pod \"7179bfc2-3556-43fb-9226-d0857434de86\" (UID: \"7179bfc2-3556-43fb-9226-d0857434de86\") "
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.341282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-utilities" (OuterVolumeSpecName: "utilities") pod "7179bfc2-3556-43fb-9226-d0857434de86" (UID: "7179bfc2-3556-43fb-9226-d0857434de86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.346035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7179bfc2-3556-43fb-9226-d0857434de86-kube-api-access-wplt2" (OuterVolumeSpecName: "kube-api-access-wplt2") pod "7179bfc2-3556-43fb-9226-d0857434de86" (UID: "7179bfc2-3556-43fb-9226-d0857434de86"). InnerVolumeSpecName "kube-api-access-wplt2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.443156 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.443196 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wplt2\" (UniqueName: \"kubernetes.io/projected/7179bfc2-3556-43fb-9226-d0857434de86-kube-api-access-wplt2\") on node \"crc\" DevicePath \"\""
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.466917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7179bfc2-3556-43fb-9226-d0857434de86" (UID: "7179bfc2-3556-43fb-9226-d0857434de86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.545479 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7179bfc2-3556-43fb-9226-d0857434de86-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.829237 4707 generic.go:334] "Generic (PLEG): container finished" podID="7179bfc2-3556-43fb-9226-d0857434de86" containerID="61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a" exitCode=0
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.829318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw8cx" event={"ID":"7179bfc2-3556-43fb-9226-d0857434de86","Type":"ContainerDied","Data":"61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a"}
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.829353 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tw8cx"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.829384 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw8cx" event={"ID":"7179bfc2-3556-43fb-9226-d0857434de86","Type":"ContainerDied","Data":"e23143bf2284bff7aaf073f8e24a4d32d9aefd2e52d619c36978a13731916d67"}
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.829426 4707 scope.go:117] "RemoveContainer" containerID="61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.857817 4707 scope.go:117] "RemoveContainer" containerID="33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.883962 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tw8cx"]
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.899556 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tw8cx"]
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.911518 4707 scope.go:117] "RemoveContainer" containerID="5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.947546 4707 scope.go:117] "RemoveContainer" containerID="61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a"
Feb 18 06:32:53 crc kubenswrapper[4707]: E0218 06:32:53.948367 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a\": container with ID starting with 61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a not found: ID does not exist" containerID="61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.948397 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a"} err="failed to get container status \"61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a\": rpc error: code = NotFound desc = could not find container \"61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a\": container with ID starting with 61e801f3078459ac0936c5d9969233a7183af67aa5f400a293f26ec38eda018a not found: ID does not exist"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.948419 4707 scope.go:117] "RemoveContainer" containerID="33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f"
Feb 18 06:32:53 crc kubenswrapper[4707]: E0218 06:32:53.948741 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f\": container with ID starting with 33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f not found: ID does not exist" containerID="33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.948764 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f"} err="failed to get container status \"33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f\": rpc error: code = NotFound desc = could not find container \"33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f\": container with ID starting with 33ea31c433ef1adbf00bf5b367531c448e6371441fd3757971136bb1b047692f not found: ID does not exist"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.948829 4707 scope.go:117] "RemoveContainer" containerID="5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668"
Feb 18 06:32:53 crc kubenswrapper[4707]: E0218 06:32:53.949198 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668\": container with ID starting with 5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668 not found: ID does not exist" containerID="5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668"
Feb 18 06:32:53 crc kubenswrapper[4707]: I0218 06:32:53.949244 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668"} err="failed to get container status \"5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668\": rpc error: code = NotFound desc = could not find container \"5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668\": container with ID starting with 5a668c187409f0c7180f12b7fe97b3a6c24b5196c7f673e30fb64488c6bd5668 not found: ID does not exist"
Feb 18 06:32:54 crc kubenswrapper[4707]: I0218 06:32:54.069743 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7179bfc2-3556-43fb-9226-d0857434de86" path="/var/lib/kubelet/pods/7179bfc2-3556-43fb-9226-d0857434de86/volumes"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.012754 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 18 06:33:10 crc kubenswrapper[4707]: E0218 06:33:10.013841 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7179bfc2-3556-43fb-9226-d0857434de86" containerName="registry-server"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.013856 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7179bfc2-3556-43fb-9226-d0857434de86" containerName="registry-server"
Feb 18 06:33:10 crc kubenswrapper[4707]: E0218 06:33:10.013868 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7179bfc2-3556-43fb-9226-d0857434de86" containerName="extract-content"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.013874 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7179bfc2-3556-43fb-9226-d0857434de86" containerName="extract-content"
Feb 18 06:33:10 crc kubenswrapper[4707]: E0218 06:33:10.013895 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7179bfc2-3556-43fb-9226-d0857434de86" containerName="extract-utilities"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.013902 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7179bfc2-3556-43fb-9226-d0857434de86" containerName="extract-utilities"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.014118 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7179bfc2-3556-43fb-9226-d0857434de86" containerName="registry-server"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.014727 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.017512 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.017997 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.020773 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.028572 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.078351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfn9q\" (UniqueName: \"kubernetes.io/projected/e369f41f-534e-48ee-bdcb-da26b742cfc3-kube-api-access-dfn9q\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.078599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.078680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.078782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.078893 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.078965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.079032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.079112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-config-data\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.079247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.180756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.180827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.180870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.180909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-config-data\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.180950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.181010 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfn9q\" (UniqueName: \"kubernetes.io/projected/e369f41f-534e-48ee-bdcb-da26b742cfc3-kube-api-access-dfn9q\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.181098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.181136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.181137 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.181444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.181458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.181616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.182342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.182361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-config-data\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.186610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.186734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.195476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.201061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfn9q\" (UniqueName: \"kubernetes.io/projected/e369f41f-534e-48ee-bdcb-da26b742cfc3-kube-api-access-dfn9q\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.209181 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.339053 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.798029 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 18 06:33:10 crc kubenswrapper[4707]: I0218 06:33:10.989332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e369f41f-534e-48ee-bdcb-da26b742cfc3","Type":"ContainerStarted","Data":"8a70ea655f8a17102d760869bfe181df5573edbdd0ebec0f46cdff9a32a8dd56"}
Feb 18 06:33:42 crc kubenswrapper[4707]: E0218 06:33:42.621727 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Feb 18 06:33:42 crc kubenswrapper[4707]: E0218 06:33:42.622509 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dfn9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(e369f41f-534e-48ee-bdcb-da26b742cfc3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 06:33:42 crc kubenswrapper[4707]: E0218 06:33:42.623707 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="e369f41f-534e-48ee-bdcb-da26b742cfc3"
Feb 18 06:33:43 crc kubenswrapper[4707]: E0218 06:33:43.357229 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="e369f41f-534e-48ee-bdcb-da26b742cfc3"
Feb 18 06:33:58 crc kubenswrapper[4707]: I0218 06:33:58.479470 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 18 06:34:00 crc kubenswrapper[4707]: I0218 06:34:00.531545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e369f41f-534e-48ee-bdcb-da26b742cfc3","Type":"ContainerStarted","Data":"595bf5305d0d9237a2ee119ae6203e07713a8164a83768166db93d45cf8a5124"}
Feb 18 06:34:00 crc kubenswrapper[4707]: I0218 06:34:00.567004 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.890809003 podStartE2EDuration="52.566985308s" podCreationTimestamp="2026-02-18 06:33:08 +0000 UTC" firstStartedPulling="2026-02-18 06:33:10.800185517 +0000 UTC m=+2727.448144661" lastFinishedPulling="2026-02-18 06:33:58.476361832 +0000 UTC m=+2775.124320966" observedRunningTime="2026-02-18 06:34:00.546555477 +0000 UTC m=+2777.194514611" watchObservedRunningTime="2026-02-18 06:34:00.566985308 +0000 UTC m=+2777.214944442"
Feb 18 06:34:05 crc kubenswrapper[4707]: I0218 06:34:05.997752 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8nrlc"]
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.000754 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.007764 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nrlc"]
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.150069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz57j\" (UniqueName: \"kubernetes.io/projected/a5ccf929-aebd-4368-a7c5-2959dc0ba442-kube-api-access-kz57j\") pod \"certified-operators-8nrlc\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.150259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-utilities\") pod \"certified-operators-8nrlc\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.150412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-catalog-content\") pod \"certified-operators-8nrlc\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.252018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-utilities\") pod \"certified-operators-8nrlc\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.252132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-catalog-content\") pod \"certified-operators-8nrlc\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.252189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz57j\" (UniqueName: \"kubernetes.io/projected/a5ccf929-aebd-4368-a7c5-2959dc0ba442-kube-api-access-kz57j\") pod \"certified-operators-8nrlc\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.252661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-utilities\") pod \"certified-operators-8nrlc\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.252721 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-catalog-content\") pod \"certified-operators-8nrlc\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.275351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz57j\" (UniqueName: \"kubernetes.io/projected/a5ccf929-aebd-4368-a7c5-2959dc0ba442-kube-api-access-kz57j\") pod \"certified-operators-8nrlc\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " pod="openshift-marketplace/certified-operators-8nrlc"
Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.329818 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8nrlc" Feb 18 06:34:06 crc kubenswrapper[4707]: I0218 06:34:06.930006 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nrlc"] Feb 18 06:34:07 crc kubenswrapper[4707]: I0218 06:34:07.599029 4707 generic.go:334] "Generic (PLEG): container finished" podID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerID="60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe" exitCode=0 Feb 18 06:34:07 crc kubenswrapper[4707]: I0218 06:34:07.599097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nrlc" event={"ID":"a5ccf929-aebd-4368-a7c5-2959dc0ba442","Type":"ContainerDied","Data":"60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe"} Feb 18 06:34:07 crc kubenswrapper[4707]: I0218 06:34:07.599375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nrlc" event={"ID":"a5ccf929-aebd-4368-a7c5-2959dc0ba442","Type":"ContainerStarted","Data":"e57e77a6e4ebe529bb17066d1af4c96af42edb701c27df66aeb9e126c2678960"} Feb 18 06:34:08 crc kubenswrapper[4707]: I0218 06:34:08.611223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nrlc" event={"ID":"a5ccf929-aebd-4368-a7c5-2959dc0ba442","Type":"ContainerStarted","Data":"a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14"} Feb 18 06:34:09 crc kubenswrapper[4707]: I0218 06:34:09.622674 4707 generic.go:334] "Generic (PLEG): container finished" podID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerID="a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14" exitCode=0 Feb 18 06:34:09 crc kubenswrapper[4707]: I0218 06:34:09.623038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nrlc" 
event={"ID":"a5ccf929-aebd-4368-a7c5-2959dc0ba442","Type":"ContainerDied","Data":"a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14"} Feb 18 06:34:09 crc kubenswrapper[4707]: I0218 06:34:09.624451 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:34:10 crc kubenswrapper[4707]: I0218 06:34:10.632580 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nrlc" event={"ID":"a5ccf929-aebd-4368-a7c5-2959dc0ba442","Type":"ContainerStarted","Data":"00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef"} Feb 18 06:34:10 crc kubenswrapper[4707]: I0218 06:34:10.658189 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8nrlc" podStartSLOduration=3.248456116 podStartE2EDuration="5.658170369s" podCreationTimestamp="2026-02-18 06:34:05 +0000 UTC" firstStartedPulling="2026-02-18 06:34:07.601272329 +0000 UTC m=+2784.249231463" lastFinishedPulling="2026-02-18 06:34:10.010986582 +0000 UTC m=+2786.658945716" observedRunningTime="2026-02-18 06:34:10.649239687 +0000 UTC m=+2787.297198831" watchObservedRunningTime="2026-02-18 06:34:10.658170369 +0000 UTC m=+2787.306129503" Feb 18 06:34:16 crc kubenswrapper[4707]: I0218 06:34:16.330085 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8nrlc" Feb 18 06:34:16 crc kubenswrapper[4707]: I0218 06:34:16.330637 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8nrlc" Feb 18 06:34:16 crc kubenswrapper[4707]: I0218 06:34:16.375905 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8nrlc" Feb 18 06:34:16 crc kubenswrapper[4707]: I0218 06:34:16.725915 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-8nrlc" Feb 18 06:34:16 crc kubenswrapper[4707]: I0218 06:34:16.771915 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nrlc"] Feb 18 06:34:18 crc kubenswrapper[4707]: I0218 06:34:18.697460 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8nrlc" podUID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerName="registry-server" containerID="cri-o://00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef" gracePeriod=2 Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.202593 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nrlc" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.319089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-catalog-content\") pod \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.319248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz57j\" (UniqueName: \"kubernetes.io/projected/a5ccf929-aebd-4368-a7c5-2959dc0ba442-kube-api-access-kz57j\") pod \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.319314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-utilities\") pod \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\" (UID: \"a5ccf929-aebd-4368-a7c5-2959dc0ba442\") " Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.320440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-utilities" (OuterVolumeSpecName: "utilities") pod "a5ccf929-aebd-4368-a7c5-2959dc0ba442" (UID: "a5ccf929-aebd-4368-a7c5-2959dc0ba442"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.332494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ccf929-aebd-4368-a7c5-2959dc0ba442-kube-api-access-kz57j" (OuterVolumeSpecName: "kube-api-access-kz57j") pod "a5ccf929-aebd-4368-a7c5-2959dc0ba442" (UID: "a5ccf929-aebd-4368-a7c5-2959dc0ba442"). InnerVolumeSpecName "kube-api-access-kz57j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.379030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5ccf929-aebd-4368-a7c5-2959dc0ba442" (UID: "a5ccf929-aebd-4368-a7c5-2959dc0ba442"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.421239 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.421279 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz57j\" (UniqueName: \"kubernetes.io/projected/a5ccf929-aebd-4368-a7c5-2959dc0ba442-kube-api-access-kz57j\") on node \"crc\" DevicePath \"\"" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.421293 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ccf929-aebd-4368-a7c5-2959dc0ba442-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.706066 4707 generic.go:334] "Generic (PLEG): container finished" podID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerID="00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef" exitCode=0 Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.706107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nrlc" event={"ID":"a5ccf929-aebd-4368-a7c5-2959dc0ba442","Type":"ContainerDied","Data":"00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef"} Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.706133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nrlc" event={"ID":"a5ccf929-aebd-4368-a7c5-2959dc0ba442","Type":"ContainerDied","Data":"e57e77a6e4ebe529bb17066d1af4c96af42edb701c27df66aeb9e126c2678960"} Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.706152 4707 scope.go:117] "RemoveContainer" containerID="00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 
06:34:19.706150 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nrlc" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.725400 4707 scope.go:117] "RemoveContainer" containerID="a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.768406 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nrlc"] Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.781020 4707 scope.go:117] "RemoveContainer" containerID="60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.809549 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8nrlc"] Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.825458 4707 scope.go:117] "RemoveContainer" containerID="00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef" Feb 18 06:34:19 crc kubenswrapper[4707]: E0218 06:34:19.825894 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef\": container with ID starting with 00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef not found: ID does not exist" containerID="00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.825927 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef"} err="failed to get container status \"00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef\": rpc error: code = NotFound desc = could not find container \"00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef\": container with ID starting with 
00e75464bb0e76e9782fc412e695a8b99eb71e460213dddc8238d007d60c6bef not found: ID does not exist" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.825951 4707 scope.go:117] "RemoveContainer" containerID="a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14" Feb 18 06:34:19 crc kubenswrapper[4707]: E0218 06:34:19.826127 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14\": container with ID starting with a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14 not found: ID does not exist" containerID="a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.826145 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14"} err="failed to get container status \"a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14\": rpc error: code = NotFound desc = could not find container \"a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14\": container with ID starting with a22aaa35ef8bbc3af885772af87da78715214fba87dc409d8a232e23c9f2ac14 not found: ID does not exist" Feb 18 06:34:19 crc kubenswrapper[4707]: I0218 06:34:19.826157 4707 scope.go:117] "RemoveContainer" containerID="60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe" Feb 18 06:34:19 crc kubenswrapper[4707]: E0218 06:34:19.826414 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe\": container with ID starting with 60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe not found: ID does not exist" containerID="60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe" Feb 18 06:34:19 crc 
kubenswrapper[4707]: I0218 06:34:19.826434 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe"} err="failed to get container status \"60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe\": rpc error: code = NotFound desc = could not find container \"60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe\": container with ID starting with 60b1a51089d777459766ef6c0fc0d015785dbbbb0c532d888214b0464cc03cfe not found: ID does not exist" Feb 18 06:34:20 crc kubenswrapper[4707]: I0218 06:34:20.062877 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" path="/var/lib/kubelet/pods/a5ccf929-aebd-4368-a7c5-2959dc0ba442/volumes" Feb 18 06:34:21 crc kubenswrapper[4707]: I0218 06:34:21.382308 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:34:21 crc kubenswrapper[4707]: I0218 06:34:21.383429 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:34:51 crc kubenswrapper[4707]: I0218 06:34:51.381827 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:34:51 crc kubenswrapper[4707]: I0218 06:34:51.382388 4707 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:35:21 crc kubenswrapper[4707]: I0218 06:35:21.382072 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:35:21 crc kubenswrapper[4707]: I0218 06:35:21.382654 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:35:21 crc kubenswrapper[4707]: I0218 06:35:21.382710 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:35:21 crc kubenswrapper[4707]: I0218 06:35:21.383622 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:35:21 crc kubenswrapper[4707]: I0218 06:35:21.383693 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" 
containerName="machine-config-daemon" containerID="cri-o://36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" gracePeriod=600 Feb 18 06:35:21 crc kubenswrapper[4707]: E0218 06:35:21.535410 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:35:22 crc kubenswrapper[4707]: I0218 06:35:22.502733 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" exitCode=0 Feb 18 06:35:22 crc kubenswrapper[4707]: I0218 06:35:22.502776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc"} Feb 18 06:35:22 crc kubenswrapper[4707]: I0218 06:35:22.502868 4707 scope.go:117] "RemoveContainer" containerID="09181a591e1c61bbab693cbc930f34bf773db58270cde38cdc2b526bd9f87b1a" Feb 18 06:35:22 crc kubenswrapper[4707]: I0218 06:35:22.503549 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:35:22 crc kubenswrapper[4707]: E0218 06:35:22.503873 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:35:37 crc kubenswrapper[4707]: I0218 06:35:37.053160 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:35:37 crc kubenswrapper[4707]: E0218 06:35:37.054020 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:35:50 crc kubenswrapper[4707]: I0218 06:35:50.053482 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:35:50 crc kubenswrapper[4707]: E0218 06:35:50.054239 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:36:04 crc kubenswrapper[4707]: I0218 06:36:04.066187 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:36:04 crc kubenswrapper[4707]: E0218 06:36:04.067290 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:36:16 crc kubenswrapper[4707]: I0218 06:36:16.056011 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:36:16 crc kubenswrapper[4707]: E0218 06:36:16.058626 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:36:27 crc kubenswrapper[4707]: I0218 06:36:27.054259 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:36:27 crc kubenswrapper[4707]: E0218 06:36:27.055453 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:36:39 crc kubenswrapper[4707]: I0218 06:36:39.053697 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:36:39 crc kubenswrapper[4707]: E0218 06:36:39.054519 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:36:54 crc kubenswrapper[4707]: I0218 06:36:54.054186 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:36:54 crc kubenswrapper[4707]: E0218 06:36:54.055689 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:37:09 crc kubenswrapper[4707]: I0218 06:37:09.054431 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:37:09 crc kubenswrapper[4707]: E0218 06:37:09.055421 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:37:22 crc kubenswrapper[4707]: I0218 06:37:22.053321 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:37:22 crc kubenswrapper[4707]: E0218 06:37:22.054296 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:37:36 crc kubenswrapper[4707]: I0218 06:37:36.054186 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:37:36 crc kubenswrapper[4707]: E0218 06:37:36.055365 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:37:51 crc kubenswrapper[4707]: I0218 06:37:51.052920 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:37:51 crc kubenswrapper[4707]: E0218 06:37:51.054019 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:38:05 crc kubenswrapper[4707]: I0218 06:38:05.053133 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:38:05 crc kubenswrapper[4707]: E0218 06:38:05.054021 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:38:19 crc kubenswrapper[4707]: I0218 06:38:19.053358 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:38:19 crc kubenswrapper[4707]: E0218 06:38:19.054321 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:38:34 crc kubenswrapper[4707]: I0218 06:38:34.081830 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:38:34 crc kubenswrapper[4707]: E0218 06:38:34.082753 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:38:49 crc kubenswrapper[4707]: I0218 06:38:49.054147 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:38:49 crc kubenswrapper[4707]: E0218 06:38:49.057010 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.653231 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-56pwk"] Feb 18 06:39:00 crc kubenswrapper[4707]: E0218 06:39:00.654577 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerName="registry-server" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.654596 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerName="registry-server" Feb 18 06:39:00 crc kubenswrapper[4707]: E0218 06:39:00.654626 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerName="extract-utilities" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.654634 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerName="extract-utilities" Feb 18 06:39:00 crc kubenswrapper[4707]: E0218 06:39:00.654656 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerName="extract-content" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.654663 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerName="extract-content" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.654924 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ccf929-aebd-4368-a7c5-2959dc0ba442" containerName="registry-server" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.657251 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.682231 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56pwk"] Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.810458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abf2b1e9-91f8-41a4-95b4-a14e4af58f6f-catalog-content\") pod \"community-operators-56pwk\" (UID: \"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f\") " pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.810638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgh9d\" (UniqueName: \"kubernetes.io/projected/abf2b1e9-91f8-41a4-95b4-a14e4af58f6f-kube-api-access-mgh9d\") pod \"community-operators-56pwk\" (UID: \"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f\") " pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.810660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abf2b1e9-91f8-41a4-95b4-a14e4af58f6f-utilities\") pod \"community-operators-56pwk\" (UID: \"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f\") " pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.913348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgh9d\" (UniqueName: \"kubernetes.io/projected/abf2b1e9-91f8-41a4-95b4-a14e4af58f6f-kube-api-access-mgh9d\") pod \"community-operators-56pwk\" (UID: \"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f\") " pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.913408 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abf2b1e9-91f8-41a4-95b4-a14e4af58f6f-utilities\") pod \"community-operators-56pwk\" (UID: \"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f\") " pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.913512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abf2b1e9-91f8-41a4-95b4-a14e4af58f6f-catalog-content\") pod \"community-operators-56pwk\" (UID: \"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f\") " pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.914226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abf2b1e9-91f8-41a4-95b4-a14e4af58f6f-catalog-content\") pod \"community-operators-56pwk\" (UID: \"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f\") " pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.914939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abf2b1e9-91f8-41a4-95b4-a14e4af58f6f-utilities\") pod \"community-operators-56pwk\" (UID: \"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f\") " pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:00 crc kubenswrapper[4707]: I0218 06:39:00.940189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgh9d\" (UniqueName: \"kubernetes.io/projected/abf2b1e9-91f8-41a4-95b4-a14e4af58f6f-kube-api-access-mgh9d\") pod \"community-operators-56pwk\" (UID: \"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f\") " pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:01 crc kubenswrapper[4707]: I0218 06:39:01.013330 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:01 crc kubenswrapper[4707]: I0218 06:39:01.600902 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56pwk"] Feb 18 06:39:01 crc kubenswrapper[4707]: W0218 06:39:01.611957 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabf2b1e9_91f8_41a4_95b4_a14e4af58f6f.slice/crio-dc109f197bedc82cb7b86d4f4f7f24da80bfa4ad737f7e8bb9f17cb55f533bfd WatchSource:0}: Error finding container dc109f197bedc82cb7b86d4f4f7f24da80bfa4ad737f7e8bb9f17cb55f533bfd: Status 404 returned error can't find the container with id dc109f197bedc82cb7b86d4f4f7f24da80bfa4ad737f7e8bb9f17cb55f533bfd Feb 18 06:39:01 crc kubenswrapper[4707]: I0218 06:39:01.906113 4707 generic.go:334] "Generic (PLEG): container finished" podID="abf2b1e9-91f8-41a4-95b4-a14e4af58f6f" containerID="d5bd425f9fdf3ac78824f8c60c9e7ec6ce03fa7670b5461d9b751b6e2971614c" exitCode=0 Feb 18 06:39:01 crc kubenswrapper[4707]: I0218 06:39:01.906228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56pwk" event={"ID":"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f","Type":"ContainerDied","Data":"d5bd425f9fdf3ac78824f8c60c9e7ec6ce03fa7670b5461d9b751b6e2971614c"} Feb 18 06:39:01 crc kubenswrapper[4707]: I0218 06:39:01.906686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56pwk" event={"ID":"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f","Type":"ContainerStarted","Data":"dc109f197bedc82cb7b86d4f4f7f24da80bfa4ad737f7e8bb9f17cb55f533bfd"} Feb 18 06:39:02 crc kubenswrapper[4707]: I0218 06:39:02.054404 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:39:02 crc kubenswrapper[4707]: E0218 06:39:02.054675 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:39:07 crc kubenswrapper[4707]: I0218 06:39:07.965734 4707 generic.go:334] "Generic (PLEG): container finished" podID="abf2b1e9-91f8-41a4-95b4-a14e4af58f6f" containerID="f58eb9d38b1b578666aea07344d195ed3754c3747962dbf87fe126944360cd4c" exitCode=0 Feb 18 06:39:07 crc kubenswrapper[4707]: I0218 06:39:07.966443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56pwk" event={"ID":"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f","Type":"ContainerDied","Data":"f58eb9d38b1b578666aea07344d195ed3754c3747962dbf87fe126944360cd4c"} Feb 18 06:39:08 crc kubenswrapper[4707]: I0218 06:39:08.978627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56pwk" event={"ID":"abf2b1e9-91f8-41a4-95b4-a14e4af58f6f","Type":"ContainerStarted","Data":"d94351859dbd5ad277eec5f1f46ffe6c6056e3e94d524d78b0f73865ed9c6ff5"} Feb 18 06:39:11 crc kubenswrapper[4707]: I0218 06:39:11.013778 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:11 crc kubenswrapper[4707]: I0218 06:39:11.014185 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:11 crc kubenswrapper[4707]: I0218 06:39:11.068487 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:11 crc kubenswrapper[4707]: I0218 06:39:11.089049 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-56pwk" podStartSLOduration=4.252889938 podStartE2EDuration="11.089021789s" podCreationTimestamp="2026-02-18 06:39:00 +0000 UTC" firstStartedPulling="2026-02-18 06:39:01.908054638 +0000 UTC m=+3078.556013772" lastFinishedPulling="2026-02-18 06:39:08.744186489 +0000 UTC m=+3085.392145623" observedRunningTime="2026-02-18 06:39:09.007144671 +0000 UTC m=+3085.655103815" watchObservedRunningTime="2026-02-18 06:39:11.089021789 +0000 UTC m=+3087.736980933" Feb 18 06:39:15 crc kubenswrapper[4707]: I0218 06:39:15.053401 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:39:15 crc kubenswrapper[4707]: E0218 06:39:15.054068 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:39:21 crc kubenswrapper[4707]: I0218 06:39:21.083243 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-56pwk" Feb 18 06:39:21 crc kubenswrapper[4707]: I0218 06:39:21.174073 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56pwk"] Feb 18 06:39:21 crc kubenswrapper[4707]: I0218 06:39:21.212976 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b26jp"] Feb 18 06:39:21 crc kubenswrapper[4707]: I0218 06:39:21.213238 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b26jp" podUID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerName="registry-server" 
containerID="cri-o://f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b" gracePeriod=2 Feb 18 06:39:21 crc kubenswrapper[4707]: I0218 06:39:21.965291 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b26jp" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.108501 4707 generic.go:334] "Generic (PLEG): container finished" podID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerID="f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b" exitCode=0 Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.108746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b26jp" event={"ID":"0f6f6f77-5e62-4186-bea4-19b13aa2a79a","Type":"ContainerDied","Data":"f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b"} Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.108781 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b26jp" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.108827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b26jp" event={"ID":"0f6f6f77-5e62-4186-bea4-19b13aa2a79a","Type":"ContainerDied","Data":"45e3c61e1a9c18a32f5e29efce9d8cde4879ffb331e6076ddca8e5771642963c"} Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.108851 4707 scope.go:117] "RemoveContainer" containerID="f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.124005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-utilities\") pod \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.125482 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-utilities" (OuterVolumeSpecName: "utilities") pod "0f6f6f77-5e62-4186-bea4-19b13aa2a79a" (UID: "0f6f6f77-5e62-4186-bea4-19b13aa2a79a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.125664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9w9g\" (UniqueName: \"kubernetes.io/projected/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-kube-api-access-d9w9g\") pod \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.126092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-catalog-content\") pod \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\" (UID: \"0f6f6f77-5e62-4186-bea4-19b13aa2a79a\") " Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.127855 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.143179 4707 scope.go:117] "RemoveContainer" containerID="8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.177508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-kube-api-access-d9w9g" (OuterVolumeSpecName: "kube-api-access-d9w9g") pod "0f6f6f77-5e62-4186-bea4-19b13aa2a79a" (UID: "0f6f6f77-5e62-4186-bea4-19b13aa2a79a"). InnerVolumeSpecName "kube-api-access-d9w9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.221162 4707 scope.go:117] "RemoveContainer" containerID="b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.231033 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9w9g\" (UniqueName: \"kubernetes.io/projected/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-kube-api-access-d9w9g\") on node \"crc\" DevicePath \"\"" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.233180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f6f6f77-5e62-4186-bea4-19b13aa2a79a" (UID: "0f6f6f77-5e62-4186-bea4-19b13aa2a79a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.279394 4707 scope.go:117] "RemoveContainer" containerID="f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b" Feb 18 06:39:22 crc kubenswrapper[4707]: E0218 06:39:22.280352 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b\": container with ID starting with f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b not found: ID does not exist" containerID="f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.280433 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b"} err="failed to get container status \"f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b\": rpc error: code = NotFound desc = could not find 
container \"f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b\": container with ID starting with f529e88b519d1add914cb2f0b5d8ea0c3d9ef63567984367013e53a9aaf6e89b not found: ID does not exist" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.280477 4707 scope.go:117] "RemoveContainer" containerID="8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc" Feb 18 06:39:22 crc kubenswrapper[4707]: E0218 06:39:22.281321 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc\": container with ID starting with 8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc not found: ID does not exist" containerID="8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.281374 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc"} err="failed to get container status \"8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc\": rpc error: code = NotFound desc = could not find container \"8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc\": container with ID starting with 8df3e722fc10b89ff57e2d5e364800fcf8b6721a0c4580973fd7cadde3e3c4bc not found: ID does not exist" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.281407 4707 scope.go:117] "RemoveContainer" containerID="b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee" Feb 18 06:39:22 crc kubenswrapper[4707]: E0218 06:39:22.281732 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee\": container with ID starting with b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee not found: ID does 
not exist" containerID="b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.281776 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee"} err="failed to get container status \"b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee\": rpc error: code = NotFound desc = could not find container \"b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee\": container with ID starting with b11deaa9330d2aa8b25d1689e66bb3edaa412d8d6a005b2a0900cefb6b4a1bee not found: ID does not exist" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.333879 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6f6f77-5e62-4186-bea4-19b13aa2a79a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.463580 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b26jp"] Feb 18 06:39:22 crc kubenswrapper[4707]: I0218 06:39:22.474661 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b26jp"] Feb 18 06:39:24 crc kubenswrapper[4707]: I0218 06:39:24.065056 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" path="/var/lib/kubelet/pods/0f6f6f77-5e62-4186-bea4-19b13aa2a79a/volumes" Feb 18 06:39:30 crc kubenswrapper[4707]: I0218 06:39:30.054085 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:39:30 crc kubenswrapper[4707]: E0218 06:39:30.055773 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:39:43 crc kubenswrapper[4707]: I0218 06:39:43.053986 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:39:43 crc kubenswrapper[4707]: E0218 06:39:43.054629 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:39:58 crc kubenswrapper[4707]: I0218 06:39:58.053093 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:39:58 crc kubenswrapper[4707]: E0218 06:39:58.053881 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:40:12 crc kubenswrapper[4707]: I0218 06:40:12.053129 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:40:12 crc kubenswrapper[4707]: E0218 06:40:12.053844 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:40:23 crc kubenswrapper[4707]: I0218 06:40:23.053898 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc" Feb 18 06:40:23 crc kubenswrapper[4707]: I0218 06:40:23.686133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"800e6e55b8d062e998b7bb677fcc99ddd2ea1329b2dda358c0533660d33286f2"} Feb 18 06:41:28 crc kubenswrapper[4707]: I0218 06:41:28.185516 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-878756b99-xx5vn" podUID="a6b4c749-b753-42b9-8bc7-fb25121f0ea8" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.383329 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j5lqp"] Feb 18 06:42:42 crc kubenswrapper[4707]: E0218 06:42:42.384246 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerName="registry-server" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.384260 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerName="registry-server" Feb 18 06:42:42 crc kubenswrapper[4707]: E0218 06:42:42.384288 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerName="extract-utilities" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.384294 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerName="extract-utilities" Feb 18 06:42:42 crc kubenswrapper[4707]: E0218 06:42:42.384321 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerName="extract-content" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.384327 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerName="extract-content" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.384509 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6f6f77-5e62-4186-bea4-19b13aa2a79a" containerName="registry-server" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.385913 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.418465 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5lqp"] Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.530755 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h8tl\" (UniqueName: \"kubernetes.io/projected/80f3c94b-0bcd-4184-b450-5c9670e9598f-kube-api-access-5h8tl\") pod \"redhat-marketplace-j5lqp\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") " pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.531123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-catalog-content\") pod \"redhat-marketplace-j5lqp\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") " pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.531194 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-utilities\") pod \"redhat-marketplace-j5lqp\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") " pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.633458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-catalog-content\") pod \"redhat-marketplace-j5lqp\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") " pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.633520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-utilities\") pod \"redhat-marketplace-j5lqp\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") " pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.633621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h8tl\" (UniqueName: \"kubernetes.io/projected/80f3c94b-0bcd-4184-b450-5c9670e9598f-kube-api-access-5h8tl\") pod \"redhat-marketplace-j5lqp\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") " pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.634123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-catalog-content\") pod \"redhat-marketplace-j5lqp\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") " pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.634183 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-utilities\") pod \"redhat-marketplace-j5lqp\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") " pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.653377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h8tl\" (UniqueName: \"kubernetes.io/projected/80f3c94b-0bcd-4184-b450-5c9670e9598f-kube-api-access-5h8tl\") pod \"redhat-marketplace-j5lqp\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") " pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:42 crc kubenswrapper[4707]: I0218 06:42:42.709282 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5lqp" Feb 18 06:42:43 crc kubenswrapper[4707]: I0218 06:42:43.291330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5lqp"] Feb 18 06:42:43 crc kubenswrapper[4707]: I0218 06:42:43.950946 4707 generic.go:334] "Generic (PLEG): container finished" podID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerID="fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204" exitCode=0 Feb 18 06:42:43 crc kubenswrapper[4707]: I0218 06:42:43.950992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5lqp" event={"ID":"80f3c94b-0bcd-4184-b450-5c9670e9598f","Type":"ContainerDied","Data":"fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204"} Feb 18 06:42:43 crc kubenswrapper[4707]: I0218 06:42:43.952397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5lqp" event={"ID":"80f3c94b-0bcd-4184-b450-5c9670e9598f","Type":"ContainerStarted","Data":"12d41caf08a7d7c496f908683c78af9d204cf6fd739bb160a17dd9ec98ad469a"} Feb 18 06:42:43 crc kubenswrapper[4707]: I0218 06:42:43.954989 4707 
provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 06:42:44 crc kubenswrapper[4707]: I0218 06:42:44.962308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5lqp" event={"ID":"80f3c94b-0bcd-4184-b450-5c9670e9598f","Type":"ContainerStarted","Data":"390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980"}
Feb 18 06:42:46 crc kubenswrapper[4707]: I0218 06:42:46.984583 4707 generic.go:334] "Generic (PLEG): container finished" podID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerID="390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980" exitCode=0
Feb 18 06:42:46 crc kubenswrapper[4707]: I0218 06:42:46.984592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5lqp" event={"ID":"80f3c94b-0bcd-4184-b450-5c9670e9598f","Type":"ContainerDied","Data":"390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980"}
Feb 18 06:42:48 crc kubenswrapper[4707]: I0218 06:42:48.009900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5lqp" event={"ID":"80f3c94b-0bcd-4184-b450-5c9670e9598f","Type":"ContainerStarted","Data":"15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f"}
Feb 18 06:42:48 crc kubenswrapper[4707]: I0218 06:42:48.033653 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j5lqp" podStartSLOduration=2.588659802 podStartE2EDuration="6.033631962s" podCreationTimestamp="2026-02-18 06:42:42 +0000 UTC" firstStartedPulling="2026-02-18 06:42:43.954757977 +0000 UTC m=+3300.602717111" lastFinishedPulling="2026-02-18 06:42:47.399730137 +0000 UTC m=+3304.047689271" observedRunningTime="2026-02-18 06:42:48.025812671 +0000 UTC m=+3304.673771805" watchObservedRunningTime="2026-02-18 06:42:48.033631962 +0000 UTC m=+3304.681591096"
Feb 18 06:42:51 crc kubenswrapper[4707]: I0218 06:42:51.382392 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:42:51 crc kubenswrapper[4707]: I0218 06:42:51.383031 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:42:52 crc kubenswrapper[4707]: I0218 06:42:52.710338 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j5lqp"
Feb 18 06:42:52 crc kubenswrapper[4707]: I0218 06:42:52.710818 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j5lqp"
Feb 18 06:42:52 crc kubenswrapper[4707]: I0218 06:42:52.761690 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j5lqp"
Feb 18 06:42:53 crc kubenswrapper[4707]: I0218 06:42:53.105562 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j5lqp"
Feb 18 06:42:53 crc kubenswrapper[4707]: I0218 06:42:53.153375 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5lqp"]
Feb 18 06:42:55 crc kubenswrapper[4707]: I0218 06:42:55.073732 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j5lqp" podUID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerName="registry-server" containerID="cri-o://15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f" gracePeriod=2
Feb 18 06:42:55 crc kubenswrapper[4707]: I0218 06:42:55.858215 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5lqp"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.001011 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-catalog-content\") pod \"80f3c94b-0bcd-4184-b450-5c9670e9598f\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") "
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.001190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h8tl\" (UniqueName: \"kubernetes.io/projected/80f3c94b-0bcd-4184-b450-5c9670e9598f-kube-api-access-5h8tl\") pod \"80f3c94b-0bcd-4184-b450-5c9670e9598f\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") "
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.001356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-utilities\") pod \"80f3c94b-0bcd-4184-b450-5c9670e9598f\" (UID: \"80f3c94b-0bcd-4184-b450-5c9670e9598f\") "
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.002217 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-utilities" (OuterVolumeSpecName: "utilities") pod "80f3c94b-0bcd-4184-b450-5c9670e9598f" (UID: "80f3c94b-0bcd-4184-b450-5c9670e9598f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.008070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f3c94b-0bcd-4184-b450-5c9670e9598f-kube-api-access-5h8tl" (OuterVolumeSpecName: "kube-api-access-5h8tl") pod "80f3c94b-0bcd-4184-b450-5c9670e9598f" (UID: "80f3c94b-0bcd-4184-b450-5c9670e9598f"). InnerVolumeSpecName "kube-api-access-5h8tl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.027616 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80f3c94b-0bcd-4184-b450-5c9670e9598f" (UID: "80f3c94b-0bcd-4184-b450-5c9670e9598f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.085839 4707 generic.go:334] "Generic (PLEG): container finished" podID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerID="15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f" exitCode=0
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.085895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5lqp" event={"ID":"80f3c94b-0bcd-4184-b450-5c9670e9598f","Type":"ContainerDied","Data":"15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f"}
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.085905 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5lqp"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.085926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5lqp" event={"ID":"80f3c94b-0bcd-4184-b450-5c9670e9598f","Type":"ContainerDied","Data":"12d41caf08a7d7c496f908683c78af9d204cf6fd739bb160a17dd9ec98ad469a"}
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.085964 4707 scope.go:117] "RemoveContainer" containerID="15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.104310 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.104374 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h8tl\" (UniqueName: \"kubernetes.io/projected/80f3c94b-0bcd-4184-b450-5c9670e9598f-kube-api-access-5h8tl\") on node \"crc\" DevicePath \"\""
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.104389 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f3c94b-0bcd-4184-b450-5c9670e9598f-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.115100 4707 scope.go:117] "RemoveContainer" containerID="390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.117519 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5lqp"]
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.128825 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5lqp"]
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.140208 4707 scope.go:117] "RemoveContainer" containerID="fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.202943 4707 scope.go:117] "RemoveContainer" containerID="15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f"
Feb 18 06:42:56 crc kubenswrapper[4707]: E0218 06:42:56.203811 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f\": container with ID starting with 15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f not found: ID does not exist" containerID="15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.203860 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f"} err="failed to get container status \"15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f\": rpc error: code = NotFound desc = could not find container \"15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f\": container with ID starting with 15ef3ebae11437deeac751ba45fe8352ce654cd55e45824c05ad2854675ae94f not found: ID does not exist"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.203896 4707 scope.go:117] "RemoveContainer" containerID="390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980"
Feb 18 06:42:56 crc kubenswrapper[4707]: E0218 06:42:56.204313 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980\": container with ID starting with 390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980 not found: ID does not exist" containerID="390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.204341 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980"} err="failed to get container status \"390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980\": rpc error: code = NotFound desc = could not find container \"390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980\": container with ID starting with 390b33db9580353f777bd2d261f8680b89bae6c6c4ab1b15ed7465cdbffd6980 not found: ID does not exist"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.204358 4707 scope.go:117] "RemoveContainer" containerID="fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204"
Feb 18 06:42:56 crc kubenswrapper[4707]: E0218 06:42:56.204648 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204\": container with ID starting with fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204 not found: ID does not exist" containerID="fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204"
Feb 18 06:42:56 crc kubenswrapper[4707]: I0218 06:42:56.204673 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204"} err="failed to get container status \"fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204\": rpc error: code = NotFound desc = could not find container \"fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204\": container with ID starting with fe6e1a5f95de73061a0520f40159cd0c35855b59a440a631a79ead8aaef39204 not found: ID does not exist"
Feb 18 06:42:58 crc kubenswrapper[4707]: I0218 06:42:58.073636 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f3c94b-0bcd-4184-b450-5c9670e9598f" path="/var/lib/kubelet/pods/80f3c94b-0bcd-4184-b450-5c9670e9598f/volumes"
Feb 18 06:43:21 crc kubenswrapper[4707]: I0218 06:43:21.382609 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:43:21 crc kubenswrapper[4707]: I0218 06:43:21.383210 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:43:51 crc kubenswrapper[4707]: I0218 06:43:51.382272 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:43:51 crc kubenswrapper[4707]: I0218 06:43:51.382733 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:43:51 crc kubenswrapper[4707]: I0218 06:43:51.382779 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6"
Feb 18 06:43:51 crc kubenswrapper[4707]: I0218 06:43:51.383522 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"800e6e55b8d062e998b7bb677fcc99ddd2ea1329b2dda358c0533660d33286f2"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 06:43:51 crc kubenswrapper[4707]: I0218 06:43:51.383577 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://800e6e55b8d062e998b7bb677fcc99ddd2ea1329b2dda358c0533660d33286f2" gracePeriod=600
Feb 18 06:43:51 crc kubenswrapper[4707]: I0218 06:43:51.575208 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="800e6e55b8d062e998b7bb677fcc99ddd2ea1329b2dda358c0533660d33286f2" exitCode=0
Feb 18 06:43:51 crc kubenswrapper[4707]: I0218 06:43:51.575270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"800e6e55b8d062e998b7bb677fcc99ddd2ea1329b2dda358c0533660d33286f2"}
Feb 18 06:43:51 crc kubenswrapper[4707]: I0218 06:43:51.575327 4707 scope.go:117] "RemoveContainer" containerID="36c15f7f558c6d4048f3e758be953daa961af4a5a287b71ac6239f70736601fc"
Feb 18 06:43:52 crc kubenswrapper[4707]: I0218 06:43:52.585958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2"}
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.432268 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7tjdr"]
Feb 18 06:44:05 crc kubenswrapper[4707]: E0218 06:44:05.433384 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerName="extract-content"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.433402 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerName="extract-content"
Feb 18 06:44:05 crc kubenswrapper[4707]: E0218 06:44:05.433440 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerName="extract-utilities"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.433449 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerName="extract-utilities"
Feb 18 06:44:05 crc kubenswrapper[4707]: E0218 06:44:05.433469 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerName="registry-server"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.433477 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerName="registry-server"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.433723 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f3c94b-0bcd-4184-b450-5c9670e9598f" containerName="registry-server"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.437150 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.467598 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tjdr"]
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.619876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-catalog-content\") pod \"redhat-operators-7tjdr\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.620332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-utilities\") pod \"redhat-operators-7tjdr\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.620418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8s8h\" (UniqueName: \"kubernetes.io/projected/049c4c64-de42-4371-aeae-38a5a4ffdd06-kube-api-access-r8s8h\") pod \"redhat-operators-7tjdr\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.721867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-utilities\") pod \"redhat-operators-7tjdr\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.721956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8s8h\" (UniqueName: \"kubernetes.io/projected/049c4c64-de42-4371-aeae-38a5a4ffdd06-kube-api-access-r8s8h\") pod \"redhat-operators-7tjdr\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.722040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-catalog-content\") pod \"redhat-operators-7tjdr\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.722476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-utilities\") pod \"redhat-operators-7tjdr\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.722505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-catalog-content\") pod \"redhat-operators-7tjdr\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.756355 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8s8h\" (UniqueName: \"kubernetes.io/projected/049c4c64-de42-4371-aeae-38a5a4ffdd06-kube-api-access-r8s8h\") pod \"redhat-operators-7tjdr\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:05 crc kubenswrapper[4707]: I0218 06:44:05.759880 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:06 crc kubenswrapper[4707]: I0218 06:44:06.269926 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tjdr"]
Feb 18 06:44:06 crc kubenswrapper[4707]: I0218 06:44:06.700058 4707 generic.go:334] "Generic (PLEG): container finished" podID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerID="7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350" exitCode=0
Feb 18 06:44:06 crc kubenswrapper[4707]: I0218 06:44:06.700230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tjdr" event={"ID":"049c4c64-de42-4371-aeae-38a5a4ffdd06","Type":"ContainerDied","Data":"7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350"}
Feb 18 06:44:06 crc kubenswrapper[4707]: I0218 06:44:06.700415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tjdr" event={"ID":"049c4c64-de42-4371-aeae-38a5a4ffdd06","Type":"ContainerStarted","Data":"6f89bd81a02fd7933b5177f32a0415f2b304f593651ee3269513fae4faa34020"}
Feb 18 06:44:07 crc kubenswrapper[4707]: I0218 06:44:07.712109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tjdr" event={"ID":"049c4c64-de42-4371-aeae-38a5a4ffdd06","Type":"ContainerStarted","Data":"b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938"}
Feb 18 06:44:12 crc kubenswrapper[4707]: I0218 06:44:12.752830 4707 generic.go:334] "Generic (PLEG): container finished" podID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerID="b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938" exitCode=0
Feb 18 06:44:12 crc kubenswrapper[4707]: I0218 06:44:12.752973 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tjdr" event={"ID":"049c4c64-de42-4371-aeae-38a5a4ffdd06","Type":"ContainerDied","Data":"b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938"}
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.091025 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xwdhh"]
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.093268 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.110271 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwdhh"]
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.129666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-utilities\") pod \"certified-operators-xwdhh\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") " pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.129713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tff2c\" (UniqueName: \"kubernetes.io/projected/194ec371-bfcd-4917-b1e1-c98865415546-kube-api-access-tff2c\") pod \"certified-operators-xwdhh\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") " pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.129986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-catalog-content\") pod \"certified-operators-xwdhh\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") " pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.231526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-catalog-content\") pod \"certified-operators-xwdhh\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") " pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.231621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-utilities\") pod \"certified-operators-xwdhh\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") " pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.231645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tff2c\" (UniqueName: \"kubernetes.io/projected/194ec371-bfcd-4917-b1e1-c98865415546-kube-api-access-tff2c\") pod \"certified-operators-xwdhh\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") " pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.232002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-catalog-content\") pod \"certified-operators-xwdhh\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") " pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.232270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-utilities\") pod \"certified-operators-xwdhh\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") " pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.262954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tff2c\" (UniqueName: \"kubernetes.io/projected/194ec371-bfcd-4917-b1e1-c98865415546-kube-api-access-tff2c\") pod \"certified-operators-xwdhh\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") " pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.412173 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.773404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tjdr" event={"ID":"049c4c64-de42-4371-aeae-38a5a4ffdd06","Type":"ContainerStarted","Data":"e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05"}
Feb 18 06:44:14 crc kubenswrapper[4707]: I0218 06:44:14.827187 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7tjdr" podStartSLOduration=2.333030213 podStartE2EDuration="9.827165673s" podCreationTimestamp="2026-02-18 06:44:05 +0000 UTC" firstStartedPulling="2026-02-18 06:44:06.701721828 +0000 UTC m=+3383.349680972" lastFinishedPulling="2026-02-18 06:44:14.195857288 +0000 UTC m=+3390.843816432" observedRunningTime="2026-02-18 06:44:14.826234269 +0000 UTC m=+3391.474193403" watchObservedRunningTime="2026-02-18 06:44:14.827165673 +0000 UTC m=+3391.475124807"
Feb 18 06:44:15 crc kubenswrapper[4707]: I0218 06:44:15.010127 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwdhh"]
Feb 18 06:44:15 crc kubenswrapper[4707]: I0218 06:44:15.760980 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:15 crc kubenswrapper[4707]: I0218 06:44:15.761308 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7tjdr"
Feb 18 06:44:15 crc kubenswrapper[4707]: I0218 06:44:15.783045 4707 generic.go:334] "Generic (PLEG): container finished" podID="194ec371-bfcd-4917-b1e1-c98865415546" containerID="8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983" exitCode=0
Feb 18 06:44:15 crc kubenswrapper[4707]: I0218 06:44:15.783090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwdhh" event={"ID":"194ec371-bfcd-4917-b1e1-c98865415546","Type":"ContainerDied","Data":"8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983"}
Feb 18 06:44:15 crc kubenswrapper[4707]: I0218 06:44:15.783151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwdhh" event={"ID":"194ec371-bfcd-4917-b1e1-c98865415546","Type":"ContainerStarted","Data":"8b19518f33f08cb16c24af5f82aff8fa4d6c2adf8fa08911bbb66a35c9e92051"}
Feb 18 06:44:16 crc kubenswrapper[4707]: I0218 06:44:16.823038 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7tjdr" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="registry-server" probeResult="failure" output=<
Feb 18 06:44:16 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s
Feb 18 06:44:16 crc kubenswrapper[4707]: >
Feb 18 06:44:17 crc kubenswrapper[4707]: I0218 06:44:17.802237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwdhh" event={"ID":"194ec371-bfcd-4917-b1e1-c98865415546","Type":"ContainerStarted","Data":"9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab"}
Feb 18 06:44:18 crc kubenswrapper[4707]: I0218 06:44:18.813010 4707 generic.go:334] "Generic (PLEG): container finished" podID="194ec371-bfcd-4917-b1e1-c98865415546" containerID="9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab" exitCode=0
Feb 18 06:44:18 crc kubenswrapper[4707]: I0218 06:44:18.813245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwdhh" event={"ID":"194ec371-bfcd-4917-b1e1-c98865415546","Type":"ContainerDied","Data":"9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab"}
Feb 18 06:44:19 crc kubenswrapper[4707]: I0218 06:44:19.824734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwdhh" event={"ID":"194ec371-bfcd-4917-b1e1-c98865415546","Type":"ContainerStarted","Data":"83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011"}
Feb 18 06:44:19 crc kubenswrapper[4707]: I0218 06:44:19.850370 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xwdhh" podStartSLOduration=2.383809083 podStartE2EDuration="5.85034973s" podCreationTimestamp="2026-02-18 06:44:14 +0000 UTC" firstStartedPulling="2026-02-18 06:44:15.784935274 +0000 UTC m=+3392.432894418" lastFinishedPulling="2026-02-18 06:44:19.251475931 +0000 UTC m=+3395.899435065" observedRunningTime="2026-02-18 06:44:19.843047783 +0000 UTC m=+3396.491006917" watchObservedRunningTime="2026-02-18 06:44:19.85034973 +0000 UTC m=+3396.498308864"
Feb 18 06:44:24 crc kubenswrapper[4707]: I0218 06:44:24.413337 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:24 crc kubenswrapper[4707]: I0218 06:44:24.414247 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:24 crc kubenswrapper[4707]: I0218 06:44:24.472064 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:24 crc kubenswrapper[4707]: I0218 06:44:24.922140 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:24 crc kubenswrapper[4707]: I0218 06:44:24.987997 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwdhh"]
Feb 18 06:44:26 crc kubenswrapper[4707]: I0218 06:44:26.814541 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7tjdr" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="registry-server" probeResult="failure" output=<
Feb 18 06:44:26 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s
Feb 18 06:44:26 crc kubenswrapper[4707]: >
Feb 18 06:44:26 crc kubenswrapper[4707]: I0218 06:44:26.881003 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xwdhh" podUID="194ec371-bfcd-4917-b1e1-c98865415546" containerName="registry-server" containerID="cri-o://83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011" gracePeriod=2
Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.727100 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwdhh"
Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.867777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-catalog-content\") pod \"194ec371-bfcd-4917-b1e1-c98865415546\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") "
Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.867943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-utilities\") pod \"194ec371-bfcd-4917-b1e1-c98865415546\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") "
Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.868003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tff2c\" (UniqueName: \"kubernetes.io/projected/194ec371-bfcd-4917-b1e1-c98865415546-kube-api-access-tff2c\") pod \"194ec371-bfcd-4917-b1e1-c98865415546\" (UID: \"194ec371-bfcd-4917-b1e1-c98865415546\") "
Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.868857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-utilities" (OuterVolumeSpecName: "utilities") pod "194ec371-bfcd-4917-b1e1-c98865415546" (UID: "194ec371-bfcd-4917-b1e1-c98865415546"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.877502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194ec371-bfcd-4917-b1e1-c98865415546-kube-api-access-tff2c" (OuterVolumeSpecName: "kube-api-access-tff2c") pod "194ec371-bfcd-4917-b1e1-c98865415546" (UID: "194ec371-bfcd-4917-b1e1-c98865415546"). InnerVolumeSpecName "kube-api-access-tff2c".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.893950 4707 generic.go:334] "Generic (PLEG): container finished" podID="194ec371-bfcd-4917-b1e1-c98865415546" containerID="83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011" exitCode=0 Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.893991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwdhh" event={"ID":"194ec371-bfcd-4917-b1e1-c98865415546","Type":"ContainerDied","Data":"83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011"} Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.894022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwdhh" event={"ID":"194ec371-bfcd-4917-b1e1-c98865415546","Type":"ContainerDied","Data":"8b19518f33f08cb16c24af5f82aff8fa4d6c2adf8fa08911bbb66a35c9e92051"} Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.894040 4707 scope.go:117] "RemoveContainer" containerID="83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011" Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.894430 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwdhh" Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.940613 4707 scope.go:117] "RemoveContainer" containerID="9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab" Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.943282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "194ec371-bfcd-4917-b1e1-c98865415546" (UID: "194ec371-bfcd-4917-b1e1-c98865415546"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.968735 4707 scope.go:117] "RemoveContainer" containerID="8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983" Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.970409 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.970571 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194ec371-bfcd-4917-b1e1-c98865415546-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:44:27 crc kubenswrapper[4707]: I0218 06:44:27.970582 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tff2c\" (UniqueName: \"kubernetes.io/projected/194ec371-bfcd-4917-b1e1-c98865415546-kube-api-access-tff2c\") on node \"crc\" DevicePath \"\"" Feb 18 06:44:28 crc kubenswrapper[4707]: I0218 06:44:28.021787 4707 scope.go:117] "RemoveContainer" containerID="83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011" Feb 18 06:44:28 crc kubenswrapper[4707]: E0218 06:44:28.022859 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011\": container with ID starting with 83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011 not found: ID does not exist" containerID="83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011" Feb 18 06:44:28 crc kubenswrapper[4707]: I0218 06:44:28.022936 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011"} err="failed to get container status 
\"83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011\": rpc error: code = NotFound desc = could not find container \"83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011\": container with ID starting with 83487b27dd18236bd6fb3df26cc8c158f16628f0b3fb8e6d491aa52450e18011 not found: ID does not exist" Feb 18 06:44:28 crc kubenswrapper[4707]: I0218 06:44:28.022977 4707 scope.go:117] "RemoveContainer" containerID="9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab" Feb 18 06:44:28 crc kubenswrapper[4707]: E0218 06:44:28.023400 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab\": container with ID starting with 9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab not found: ID does not exist" containerID="9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab" Feb 18 06:44:28 crc kubenswrapper[4707]: I0218 06:44:28.023440 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab"} err="failed to get container status \"9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab\": rpc error: code = NotFound desc = could not find container \"9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab\": container with ID starting with 9d030a54edcf25206028d0cdbfb2b737e193709570ea345ffa497e55309b64ab not found: ID does not exist" Feb 18 06:44:28 crc kubenswrapper[4707]: I0218 06:44:28.023467 4707 scope.go:117] "RemoveContainer" containerID="8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983" Feb 18 06:44:28 crc kubenswrapper[4707]: E0218 06:44:28.023985 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983\": container with ID starting with 8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983 not found: ID does not exist" containerID="8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983" Feb 18 06:44:28 crc kubenswrapper[4707]: I0218 06:44:28.024046 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983"} err="failed to get container status \"8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983\": rpc error: code = NotFound desc = could not find container \"8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983\": container with ID starting with 8fdd8eb5b1272d7c595a1bdd478bc5df830b41b6c176366f5f822c9bb062c983 not found: ID does not exist" Feb 18 06:44:28 crc kubenswrapper[4707]: I0218 06:44:28.224047 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwdhh"] Feb 18 06:44:28 crc kubenswrapper[4707]: I0218 06:44:28.233406 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xwdhh"] Feb 18 06:44:30 crc kubenswrapper[4707]: I0218 06:44:30.065345 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="194ec371-bfcd-4917-b1e1-c98865415546" path="/var/lib/kubelet/pods/194ec371-bfcd-4917-b1e1-c98865415546/volumes" Feb 18 06:44:36 crc kubenswrapper[4707]: I0218 06:44:36.810533 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7tjdr" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="registry-server" probeResult="failure" output=< Feb 18 06:44:36 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Feb 18 06:44:36 crc kubenswrapper[4707]: > Feb 18 06:44:45 crc kubenswrapper[4707]: I0218 06:44:45.815001 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7tjdr" Feb 18 06:44:45 crc kubenswrapper[4707]: I0218 06:44:45.873095 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7tjdr" Feb 18 06:44:46 crc kubenswrapper[4707]: I0218 06:44:46.066429 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tjdr"] Feb 18 06:44:47 crc kubenswrapper[4707]: I0218 06:44:47.092935 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7tjdr" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="registry-server" containerID="cri-o://e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05" gracePeriod=2 Feb 18 06:44:47 crc kubenswrapper[4707]: I0218 06:44:47.855924 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tjdr" Feb 18 06:44:47 crc kubenswrapper[4707]: I0218 06:44:47.999707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8s8h\" (UniqueName: \"kubernetes.io/projected/049c4c64-de42-4371-aeae-38a5a4ffdd06-kube-api-access-r8s8h\") pod \"049c4c64-de42-4371-aeae-38a5a4ffdd06\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " Feb 18 06:44:47 crc kubenswrapper[4707]: I0218 06:44:47.999885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-catalog-content\") pod \"049c4c64-de42-4371-aeae-38a5a4ffdd06\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " Feb 18 06:44:47 crc kubenswrapper[4707]: I0218 06:44:47.999953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-utilities\") pod 
\"049c4c64-de42-4371-aeae-38a5a4ffdd06\" (UID: \"049c4c64-de42-4371-aeae-38a5a4ffdd06\") " Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.000822 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-utilities" (OuterVolumeSpecName: "utilities") pod "049c4c64-de42-4371-aeae-38a5a4ffdd06" (UID: "049c4c64-de42-4371-aeae-38a5a4ffdd06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.006057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049c4c64-de42-4371-aeae-38a5a4ffdd06-kube-api-access-r8s8h" (OuterVolumeSpecName: "kube-api-access-r8s8h") pod "049c4c64-de42-4371-aeae-38a5a4ffdd06" (UID: "049c4c64-de42-4371-aeae-38a5a4ffdd06"). InnerVolumeSpecName "kube-api-access-r8s8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.104391 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.104433 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8s8h\" (UniqueName: \"kubernetes.io/projected/049c4c64-de42-4371-aeae-38a5a4ffdd06-kube-api-access-r8s8h\") on node \"crc\" DevicePath \"\"" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.111105 4707 generic.go:334] "Generic (PLEG): container finished" podID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerID="e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05" exitCode=0 Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.111145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tjdr" 
event={"ID":"049c4c64-de42-4371-aeae-38a5a4ffdd06","Type":"ContainerDied","Data":"e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05"} Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.111169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tjdr" event={"ID":"049c4c64-de42-4371-aeae-38a5a4ffdd06","Type":"ContainerDied","Data":"6f89bd81a02fd7933b5177f32a0415f2b304f593651ee3269513fae4faa34020"} Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.111188 4707 scope.go:117] "RemoveContainer" containerID="e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.111316 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tjdr" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.135465 4707 scope.go:117] "RemoveContainer" containerID="b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.158699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "049c4c64-de42-4371-aeae-38a5a4ffdd06" (UID: "049c4c64-de42-4371-aeae-38a5a4ffdd06"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.169204 4707 scope.go:117] "RemoveContainer" containerID="7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.207012 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049c4c64-de42-4371-aeae-38a5a4ffdd06-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.213799 4707 scope.go:117] "RemoveContainer" containerID="e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05" Feb 18 06:44:48 crc kubenswrapper[4707]: E0218 06:44:48.214434 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05\": container with ID starting with e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05 not found: ID does not exist" containerID="e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.214482 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05"} err="failed to get container status \"e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05\": rpc error: code = NotFound desc = could not find container \"e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05\": container with ID starting with e7d2f68cbdc3f54623400043c26618d84345c765e2d9257e6a85883ff50b0d05 not found: ID does not exist" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.214528 4707 scope.go:117] "RemoveContainer" containerID="b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938" Feb 18 06:44:48 crc kubenswrapper[4707]: E0218 06:44:48.215200 4707 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938\": container with ID starting with b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938 not found: ID does not exist" containerID="b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.215231 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938"} err="failed to get container status \"b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938\": rpc error: code = NotFound desc = could not find container \"b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938\": container with ID starting with b69019e92460314b3f2460cca56ca2396bde20b1990b1ea4ea67baff536a1938 not found: ID does not exist" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.215252 4707 scope.go:117] "RemoveContainer" containerID="7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350" Feb 18 06:44:48 crc kubenswrapper[4707]: E0218 06:44:48.215617 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350\": container with ID starting with 7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350 not found: ID does not exist" containerID="7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.215662 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350"} err="failed to get container status \"7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350\": rpc error: code = NotFound desc = could 
not find container \"7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350\": container with ID starting with 7974f4e29764df144c1fd27780bba0a91a4b09994271101c155e594ea09ca350 not found: ID does not exist" Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.447574 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tjdr"] Feb 18 06:44:48 crc kubenswrapper[4707]: I0218 06:44:48.456168 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7tjdr"] Feb 18 06:44:50 crc kubenswrapper[4707]: I0218 06:44:50.065878 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" path="/var/lib/kubelet/pods/049c4c64-de42-4371-aeae-38a5a4ffdd06/volumes" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.263555 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd"] Feb 18 06:45:00 crc kubenswrapper[4707]: E0218 06:45:00.264384 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194ec371-bfcd-4917-b1e1-c98865415546" containerName="registry-server" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.264396 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="194ec371-bfcd-4917-b1e1-c98865415546" containerName="registry-server" Feb 18 06:45:00 crc kubenswrapper[4707]: E0218 06:45:00.264410 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="extract-content" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.264417 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="extract-content" Feb 18 06:45:00 crc kubenswrapper[4707]: E0218 06:45:00.264439 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="registry-server" Feb 
18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.264446 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="registry-server" Feb 18 06:45:00 crc kubenswrapper[4707]: E0218 06:45:00.264465 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="extract-utilities" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.264471 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="extract-utilities" Feb 18 06:45:00 crc kubenswrapper[4707]: E0218 06:45:00.264486 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194ec371-bfcd-4917-b1e1-c98865415546" containerName="extract-content" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.264493 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="194ec371-bfcd-4917-b1e1-c98865415546" containerName="extract-content" Feb 18 06:45:00 crc kubenswrapper[4707]: E0218 06:45:00.264501 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194ec371-bfcd-4917-b1e1-c98865415546" containerName="extract-utilities" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.264506 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="194ec371-bfcd-4917-b1e1-c98865415546" containerName="extract-utilities" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.264716 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="049c4c64-de42-4371-aeae-38a5a4ffdd06" containerName="registry-server" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.264730 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="194ec371-bfcd-4917-b1e1-c98865415546" containerName="registry-server" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.265398 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.267415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.271273 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.274181 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd"] Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.346850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w85xm\" (UniqueName: \"kubernetes.io/projected/a7e02044-f406-485f-8df3-dcb3ae725825-kube-api-access-w85xm\") pod \"collect-profiles-29523285-4hrhd\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.346961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7e02044-f406-485f-8df3-dcb3ae725825-secret-volume\") pod \"collect-profiles-29523285-4hrhd\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.347178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7e02044-f406-485f-8df3-dcb3ae725825-config-volume\") pod \"collect-profiles-29523285-4hrhd\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.450327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7e02044-f406-485f-8df3-dcb3ae725825-config-volume\") pod \"collect-profiles-29523285-4hrhd\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.449444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7e02044-f406-485f-8df3-dcb3ae725825-config-volume\") pod \"collect-profiles-29523285-4hrhd\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.450465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w85xm\" (UniqueName: \"kubernetes.io/projected/a7e02044-f406-485f-8df3-dcb3ae725825-kube-api-access-w85xm\") pod \"collect-profiles-29523285-4hrhd\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.450566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7e02044-f406-485f-8df3-dcb3ae725825-secret-volume\") pod \"collect-profiles-29523285-4hrhd\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.458599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a7e02044-f406-485f-8df3-dcb3ae725825-secret-volume\") pod \"collect-profiles-29523285-4hrhd\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.473445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w85xm\" (UniqueName: \"kubernetes.io/projected/a7e02044-f406-485f-8df3-dcb3ae725825-kube-api-access-w85xm\") pod \"collect-profiles-29523285-4hrhd\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:00 crc kubenswrapper[4707]: I0218 06:45:00.597652 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:01 crc kubenswrapper[4707]: W0218 06:45:01.135601 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e02044_f406_485f_8df3_dcb3ae725825.slice/crio-ad6bdb0e0f9f63b473727d735d129bc4cf00de98cb12eaefb2bcff5060705bfb WatchSource:0}: Error finding container ad6bdb0e0f9f63b473727d735d129bc4cf00de98cb12eaefb2bcff5060705bfb: Status 404 returned error can't find the container with id ad6bdb0e0f9f63b473727d735d129bc4cf00de98cb12eaefb2bcff5060705bfb Feb 18 06:45:01 crc kubenswrapper[4707]: I0218 06:45:01.138086 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd"] Feb 18 06:45:01 crc kubenswrapper[4707]: I0218 06:45:01.233682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" event={"ID":"a7e02044-f406-485f-8df3-dcb3ae725825","Type":"ContainerStarted","Data":"ad6bdb0e0f9f63b473727d735d129bc4cf00de98cb12eaefb2bcff5060705bfb"} Feb 18 06:45:02 crc 
kubenswrapper[4707]: I0218 06:45:02.244972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" event={"ID":"a7e02044-f406-485f-8df3-dcb3ae725825","Type":"ContainerStarted","Data":"4a0888e1f2ac9943cf288585276140e0e5bd29426f6417d73cf86a3ec84aba5b"} Feb 18 06:45:02 crc kubenswrapper[4707]: I0218 06:45:02.269020 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" podStartSLOduration=2.268991003 podStartE2EDuration="2.268991003s" podCreationTimestamp="2026-02-18 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 06:45:02.259681091 +0000 UTC m=+3438.907640255" watchObservedRunningTime="2026-02-18 06:45:02.268991003 +0000 UTC m=+3438.916950157" Feb 18 06:45:03 crc kubenswrapper[4707]: I0218 06:45:03.257141 4707 generic.go:334] "Generic (PLEG): container finished" podID="a7e02044-f406-485f-8df3-dcb3ae725825" containerID="4a0888e1f2ac9943cf288585276140e0e5bd29426f6417d73cf86a3ec84aba5b" exitCode=0 Feb 18 06:45:03 crc kubenswrapper[4707]: I0218 06:45:03.257215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" event={"ID":"a7e02044-f406-485f-8df3-dcb3ae725825","Type":"ContainerDied","Data":"4a0888e1f2ac9943cf288585276140e0e5bd29426f6417d73cf86a3ec84aba5b"} Feb 18 06:45:04 crc kubenswrapper[4707]: I0218 06:45:04.925609 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:04 crc kubenswrapper[4707]: I0218 06:45:04.940197 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w85xm\" (UniqueName: \"kubernetes.io/projected/a7e02044-f406-485f-8df3-dcb3ae725825-kube-api-access-w85xm\") pod \"a7e02044-f406-485f-8df3-dcb3ae725825\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " Feb 18 06:45:04 crc kubenswrapper[4707]: I0218 06:45:04.940263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7e02044-f406-485f-8df3-dcb3ae725825-config-volume\") pod \"a7e02044-f406-485f-8df3-dcb3ae725825\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " Feb 18 06:45:04 crc kubenswrapper[4707]: I0218 06:45:04.940396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7e02044-f406-485f-8df3-dcb3ae725825-secret-volume\") pod \"a7e02044-f406-485f-8df3-dcb3ae725825\" (UID: \"a7e02044-f406-485f-8df3-dcb3ae725825\") " Feb 18 06:45:04 crc kubenswrapper[4707]: I0218 06:45:04.941910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7e02044-f406-485f-8df3-dcb3ae725825-config-volume" (OuterVolumeSpecName: "config-volume") pod "a7e02044-f406-485f-8df3-dcb3ae725825" (UID: "a7e02044-f406-485f-8df3-dcb3ae725825"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 06:45:04 crc kubenswrapper[4707]: I0218 06:45:04.946996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e02044-f406-485f-8df3-dcb3ae725825-kube-api-access-w85xm" (OuterVolumeSpecName: "kube-api-access-w85xm") pod "a7e02044-f406-485f-8df3-dcb3ae725825" (UID: "a7e02044-f406-485f-8df3-dcb3ae725825"). 
InnerVolumeSpecName "kube-api-access-w85xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:45:04 crc kubenswrapper[4707]: I0218 06:45:04.948990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7e02044-f406-485f-8df3-dcb3ae725825-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a7e02044-f406-485f-8df3-dcb3ae725825" (UID: "a7e02044-f406-485f-8df3-dcb3ae725825"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 06:45:05 crc kubenswrapper[4707]: I0218 06:45:05.042649 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7e02044-f406-485f-8df3-dcb3ae725825-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:45:05 crc kubenswrapper[4707]: I0218 06:45:05.042694 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7e02044-f406-485f-8df3-dcb3ae725825-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 06:45:05 crc kubenswrapper[4707]: I0218 06:45:05.042709 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w85xm\" (UniqueName: \"kubernetes.io/projected/a7e02044-f406-485f-8df3-dcb3ae725825-kube-api-access-w85xm\") on node \"crc\" DevicePath \"\"" Feb 18 06:45:05 crc kubenswrapper[4707]: I0218 06:45:05.274917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" event={"ID":"a7e02044-f406-485f-8df3-dcb3ae725825","Type":"ContainerDied","Data":"ad6bdb0e0f9f63b473727d735d129bc4cf00de98cb12eaefb2bcff5060705bfb"} Feb 18 06:45:05 crc kubenswrapper[4707]: I0218 06:45:05.274957 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad6bdb0e0f9f63b473727d735d129bc4cf00de98cb12eaefb2bcff5060705bfb" Feb 18 06:45:05 crc kubenswrapper[4707]: I0218 06:45:05.275267 4707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523285-4hrhd" Feb 18 06:45:05 crc kubenswrapper[4707]: I0218 06:45:05.348327 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98"] Feb 18 06:45:05 crc kubenswrapper[4707]: I0218 06:45:05.356545 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523240-clj98"] Feb 18 06:45:06 crc kubenswrapper[4707]: I0218 06:45:06.069179 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="689c82fa-0c0f-44ad-bb51-cf3769313da5" path="/var/lib/kubelet/pods/689c82fa-0c0f-44ad-bb51-cf3769313da5/volumes" Feb 18 06:45:50 crc kubenswrapper[4707]: I0218 06:45:50.240233 4707 scope.go:117] "RemoveContainer" containerID="a71b2691e9a4be1cc9f471fa6a8055333e2bdd69f90cb6711ecb5def43d1abb4" Feb 18 06:45:51 crc kubenswrapper[4707]: I0218 06:45:51.382525 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:45:51 crc kubenswrapper[4707]: I0218 06:45:51.382943 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:46:21 crc kubenswrapper[4707]: I0218 06:46:21.382978 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:46:21 crc kubenswrapper[4707]: I0218 06:46:21.383636 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:46:51 crc kubenswrapper[4707]: I0218 06:46:51.382200 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:46:51 crc kubenswrapper[4707]: I0218 06:46:51.382746 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:46:51 crc kubenswrapper[4707]: I0218 06:46:51.382809 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 06:46:51 crc kubenswrapper[4707]: I0218 06:46:51.383531 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 06:46:51 crc kubenswrapper[4707]: I0218 06:46:51.383588 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" gracePeriod=600 Feb 18 06:46:51 crc kubenswrapper[4707]: E0218 06:46:51.504010 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:46:52 crc kubenswrapper[4707]: I0218 06:46:52.263774 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" exitCode=0 Feb 18 06:46:52 crc kubenswrapper[4707]: I0218 06:46:52.263952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2"} Feb 18 06:46:52 crc kubenswrapper[4707]: I0218 06:46:52.264186 4707 scope.go:117] "RemoveContainer" containerID="800e6e55b8d062e998b7bb677fcc99ddd2ea1329b2dda358c0533660d33286f2" Feb 18 06:46:52 crc kubenswrapper[4707]: I0218 06:46:52.264896 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:46:52 crc kubenswrapper[4707]: E0218 06:46:52.265139 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:47:06 crc kubenswrapper[4707]: I0218 06:47:06.053676 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:47:06 crc kubenswrapper[4707]: E0218 06:47:06.054521 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:47:19 crc kubenswrapper[4707]: I0218 06:47:19.053327 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:47:19 crc kubenswrapper[4707]: E0218 06:47:19.054400 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:47:33 crc kubenswrapper[4707]: I0218 06:47:33.053924 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:47:33 crc kubenswrapper[4707]: E0218 06:47:33.054727 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:47:46 crc kubenswrapper[4707]: I0218 06:47:46.053768 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:47:46 crc kubenswrapper[4707]: E0218 06:47:46.054746 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:48:01 crc kubenswrapper[4707]: I0218 06:48:01.060160 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:48:01 crc kubenswrapper[4707]: E0218 06:48:01.061755 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:48:15 crc kubenswrapper[4707]: I0218 06:48:15.053863 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:48:15 crc kubenswrapper[4707]: E0218 06:48:15.054991 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:48:29 crc kubenswrapper[4707]: I0218 06:48:29.054554 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:48:29 crc kubenswrapper[4707]: E0218 06:48:29.055434 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:48:43 crc kubenswrapper[4707]: I0218 06:48:43.052915 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:48:43 crc kubenswrapper[4707]: E0218 06:48:43.053726 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:48:58 crc kubenswrapper[4707]: I0218 06:48:58.053245 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:48:58 crc kubenswrapper[4707]: E0218 06:48:58.054032 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.251690 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qqwhd"] Feb 18 06:49:08 crc kubenswrapper[4707]: E0218 06:49:08.252711 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e02044-f406-485f-8df3-dcb3ae725825" containerName="collect-profiles" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.252731 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e02044-f406-485f-8df3-dcb3ae725825" containerName="collect-profiles" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.252982 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e02044-f406-485f-8df3-dcb3ae725825" containerName="collect-profiles" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.254630 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.269923 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqwhd"] Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.370530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kggs\" (UniqueName: \"kubernetes.io/projected/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-kube-api-access-5kggs\") pod \"community-operators-qqwhd\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.371055 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-utilities\") pod \"community-operators-qqwhd\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.371124 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-catalog-content\") pod \"community-operators-qqwhd\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.473248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-utilities\") pod \"community-operators-qqwhd\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.473329 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-catalog-content\") pod \"community-operators-qqwhd\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.473408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kggs\" (UniqueName: \"kubernetes.io/projected/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-kube-api-access-5kggs\") pod \"community-operators-qqwhd\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.473760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-catalog-content\") pod \"community-operators-qqwhd\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.473917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-utilities\") pod \"community-operators-qqwhd\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.496865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kggs\" (UniqueName: \"kubernetes.io/projected/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-kube-api-access-5kggs\") pod \"community-operators-qqwhd\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:08 crc kubenswrapper[4707]: I0218 06:49:08.583775 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:09 crc kubenswrapper[4707]: I0218 06:49:09.717813 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqwhd"] Feb 18 06:49:10 crc kubenswrapper[4707]: I0218 06:49:10.579519 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerID="23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1" exitCode=0 Feb 18 06:49:10 crc kubenswrapper[4707]: I0218 06:49:10.579620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwhd" event={"ID":"9ccca46f-5404-4dbc-80ea-15a6d9fd676d","Type":"ContainerDied","Data":"23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1"} Feb 18 06:49:10 crc kubenswrapper[4707]: I0218 06:49:10.579889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwhd" event={"ID":"9ccca46f-5404-4dbc-80ea-15a6d9fd676d","Type":"ContainerStarted","Data":"aac8c1cd9dfd4db287bd59dfd26feb83840c943e9875a6cd9efe39f00cfd5045"} Feb 18 06:49:10 crc kubenswrapper[4707]: I0218 06:49:10.581741 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:49:11 crc kubenswrapper[4707]: I0218 06:49:11.053889 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:49:11 crc kubenswrapper[4707]: E0218 06:49:11.054249 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 
06:49:11 crc kubenswrapper[4707]: I0218 06:49:11.591043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwhd" event={"ID":"9ccca46f-5404-4dbc-80ea-15a6d9fd676d","Type":"ContainerStarted","Data":"604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859"} Feb 18 06:49:13 crc kubenswrapper[4707]: I0218 06:49:13.611241 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerID="604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859" exitCode=0 Feb 18 06:49:13 crc kubenswrapper[4707]: I0218 06:49:13.611348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwhd" event={"ID":"9ccca46f-5404-4dbc-80ea-15a6d9fd676d","Type":"ContainerDied","Data":"604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859"} Feb 18 06:49:14 crc kubenswrapper[4707]: I0218 06:49:14.626846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwhd" event={"ID":"9ccca46f-5404-4dbc-80ea-15a6d9fd676d","Type":"ContainerStarted","Data":"5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017"} Feb 18 06:49:14 crc kubenswrapper[4707]: I0218 06:49:14.663266 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qqwhd" podStartSLOduration=3.236859493 podStartE2EDuration="6.663239595s" podCreationTimestamp="2026-02-18 06:49:08 +0000 UTC" firstStartedPulling="2026-02-18 06:49:10.581516098 +0000 UTC m=+3687.229475232" lastFinishedPulling="2026-02-18 06:49:14.0078962 +0000 UTC m=+3690.655855334" observedRunningTime="2026-02-18 06:49:14.649625776 +0000 UTC m=+3691.297584910" watchObservedRunningTime="2026-02-18 06:49:14.663239595 +0000 UTC m=+3691.311198729" Feb 18 06:49:18 crc kubenswrapper[4707]: I0218 06:49:18.584683 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:18 crc kubenswrapper[4707]: I0218 06:49:18.585423 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:18 crc kubenswrapper[4707]: I0218 06:49:18.640886 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:25 crc kubenswrapper[4707]: I0218 06:49:25.053332 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:49:25 crc kubenswrapper[4707]: E0218 06:49:25.054112 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:49:28 crc kubenswrapper[4707]: I0218 06:49:28.644501 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:28 crc kubenswrapper[4707]: I0218 06:49:28.698332 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qqwhd"] Feb 18 06:49:28 crc kubenswrapper[4707]: I0218 06:49:28.746947 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qqwhd" podUID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerName="registry-server" containerID="cri-o://5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017" gracePeriod=2 Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.581059 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.738087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-catalog-content\") pod \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.739249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kggs\" (UniqueName: \"kubernetes.io/projected/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-kube-api-access-5kggs\") pod \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.739326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-utilities\") pod \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\" (UID: \"9ccca46f-5404-4dbc-80ea-15a6d9fd676d\") " Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.739887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-utilities" (OuterVolumeSpecName: "utilities") pod "9ccca46f-5404-4dbc-80ea-15a6d9fd676d" (UID: "9ccca46f-5404-4dbc-80ea-15a6d9fd676d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.740050 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.747182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-kube-api-access-5kggs" (OuterVolumeSpecName: "kube-api-access-5kggs") pod "9ccca46f-5404-4dbc-80ea-15a6d9fd676d" (UID: "9ccca46f-5404-4dbc-80ea-15a6d9fd676d"). InnerVolumeSpecName "kube-api-access-5kggs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.757388 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerID="5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017" exitCode=0 Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.757451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwhd" event={"ID":"9ccca46f-5404-4dbc-80ea-15a6d9fd676d","Type":"ContainerDied","Data":"5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017"} Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.757484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwhd" event={"ID":"9ccca46f-5404-4dbc-80ea-15a6d9fd676d","Type":"ContainerDied","Data":"aac8c1cd9dfd4db287bd59dfd26feb83840c943e9875a6cd9efe39f00cfd5045"} Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.757504 4707 scope.go:117] "RemoveContainer" containerID="5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.757651 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqwhd" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.797406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ccca46f-5404-4dbc-80ea-15a6d9fd676d" (UID: "9ccca46f-5404-4dbc-80ea-15a6d9fd676d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.821328 4707 scope.go:117] "RemoveContainer" containerID="604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.841905 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.841932 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kggs\" (UniqueName: \"kubernetes.io/projected/9ccca46f-5404-4dbc-80ea-15a6d9fd676d-kube-api-access-5kggs\") on node \"crc\" DevicePath \"\"" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.844507 4707 scope.go:117] "RemoveContainer" containerID="23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.898080 4707 scope.go:117] "RemoveContainer" containerID="5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017" Feb 18 06:49:29 crc kubenswrapper[4707]: E0218 06:49:29.898546 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017\": container with ID starting with 5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017 not found: ID does not exist" 
containerID="5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.898592 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017"} err="failed to get container status \"5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017\": rpc error: code = NotFound desc = could not find container \"5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017\": container with ID starting with 5bbba68115fd01271d21ac9d0e146c2955865eadd74c99394d7d9d130ab98017 not found: ID does not exist" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.898627 4707 scope.go:117] "RemoveContainer" containerID="604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859" Feb 18 06:49:29 crc kubenswrapper[4707]: E0218 06:49:29.899232 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859\": container with ID starting with 604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859 not found: ID does not exist" containerID="604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.899267 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859"} err="failed to get container status \"604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859\": rpc error: code = NotFound desc = could not find container \"604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859\": container with ID starting with 604b1fb073d6b24b3553f579008b43fc2247f7f3285e4063b71e43a21f7dd859 not found: ID does not exist" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.899288 4707 scope.go:117] 
"RemoveContainer" containerID="23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1" Feb 18 06:49:29 crc kubenswrapper[4707]: E0218 06:49:29.899759 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1\": container with ID starting with 23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1 not found: ID does not exist" containerID="23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1" Feb 18 06:49:29 crc kubenswrapper[4707]: I0218 06:49:29.899813 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1"} err="failed to get container status \"23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1\": rpc error: code = NotFound desc = could not find container \"23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1\": container with ID starting with 23b184bcfb8d177fade8cc94f9f239c71c12f7f73cf0578d8e8ff5b9836796d1 not found: ID does not exist" Feb 18 06:49:30 crc kubenswrapper[4707]: I0218 06:49:30.098748 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qqwhd"] Feb 18 06:49:30 crc kubenswrapper[4707]: I0218 06:49:30.107035 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qqwhd"] Feb 18 06:49:32 crc kubenswrapper[4707]: I0218 06:49:32.064412 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" path="/var/lib/kubelet/pods/9ccca46f-5404-4dbc-80ea-15a6d9fd676d/volumes" Feb 18 06:49:38 crc kubenswrapper[4707]: I0218 06:49:38.053961 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:49:38 crc kubenswrapper[4707]: E0218 06:49:38.054769 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:49:52 crc kubenswrapper[4707]: I0218 06:49:52.053773 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:49:52 crc kubenswrapper[4707]: E0218 06:49:52.054529 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:50:06 crc kubenswrapper[4707]: I0218 06:50:06.052734 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:50:06 crc kubenswrapper[4707]: E0218 06:50:06.053687 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:50:19 crc kubenswrapper[4707]: I0218 06:50:19.053268 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:50:19 crc kubenswrapper[4707]: E0218 
06:50:19.055861 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:50:32 crc kubenswrapper[4707]: I0218 06:50:32.056467 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:50:32 crc kubenswrapper[4707]: E0218 06:50:32.057376 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:50:46 crc kubenswrapper[4707]: I0218 06:50:46.052921 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:50:46 crc kubenswrapper[4707]: E0218 06:50:46.053820 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:50:59 crc kubenswrapper[4707]: I0218 06:50:59.053874 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:50:59 crc 
kubenswrapper[4707]: E0218 06:50:59.054727 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:51:10 crc kubenswrapper[4707]: I0218 06:51:10.053945 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:51:10 crc kubenswrapper[4707]: E0218 06:51:10.054743 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:51:25 crc kubenswrapper[4707]: I0218 06:51:25.053775 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:51:25 crc kubenswrapper[4707]: E0218 06:51:25.054758 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:51:39 crc kubenswrapper[4707]: I0218 06:51:39.054187 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 
18 06:51:39 crc kubenswrapper[4707]: E0218 06:51:39.055067 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 06:51:54 crc kubenswrapper[4707]: I0218 06:51:54.065110 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2" Feb 18 06:51:55 crc kubenswrapper[4707]: I0218 06:51:55.003753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"9fe239756b1524d5039921de98375e95766bc6e0c5eee5de9059af464be56f58"} Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.330322 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd8s"] Feb 18 06:53:41 crc kubenswrapper[4707]: E0218 06:53:41.331281 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerName="extract-content" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.331293 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerName="extract-content" Feb 18 06:53:41 crc kubenswrapper[4707]: E0218 06:53:41.331316 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerName="extract-utilities" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.331322 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerName="extract-utilities" Feb 18 06:53:41 crc 
kubenswrapper[4707]: E0218 06:53:41.331351 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerName="registry-server" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.331358 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerName="registry-server" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.331567 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccca46f-5404-4dbc-80ea-15a6d9fd676d" containerName="registry-server" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.332905 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.338032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76jj9\" (UniqueName: \"kubernetes.io/projected/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-kube-api-access-76jj9\") pod \"redhat-marketplace-mdd8s\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.338138 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-utilities\") pod \"redhat-marketplace-mdd8s\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.338173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-catalog-content\") pod \"redhat-marketplace-mdd8s\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " 
pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.345434 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd8s"] Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.440304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-utilities\") pod \"redhat-marketplace-mdd8s\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.440380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-catalog-content\") pod \"redhat-marketplace-mdd8s\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.440540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76jj9\" (UniqueName: \"kubernetes.io/projected/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-kube-api-access-76jj9\") pod \"redhat-marketplace-mdd8s\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.440774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-utilities\") pod \"redhat-marketplace-mdd8s\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.440917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-catalog-content\") pod \"redhat-marketplace-mdd8s\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.466249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76jj9\" (UniqueName: \"kubernetes.io/projected/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-kube-api-access-76jj9\") pod \"redhat-marketplace-mdd8s\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:41 crc kubenswrapper[4707]: I0218 06:53:41.667674 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:42 crc kubenswrapper[4707]: I0218 06:53:42.182992 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd8s"] Feb 18 06:53:42 crc kubenswrapper[4707]: I0218 06:53:42.991934 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerID="ff829f8000d0e8471def10388b2e102e8f7c18a7f6e4164899ca4bfce93d4b21" exitCode=0 Feb 18 06:53:42 crc kubenswrapper[4707]: I0218 06:53:42.992037 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd8s" event={"ID":"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e","Type":"ContainerDied","Data":"ff829f8000d0e8471def10388b2e102e8f7c18a7f6e4164899ca4bfce93d4b21"} Feb 18 06:53:42 crc kubenswrapper[4707]: I0218 06:53:42.992322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd8s" event={"ID":"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e","Type":"ContainerStarted","Data":"a1c49b30b4d2f950f7b4420d63c46d75a01e49c6f0dec4ef736d872041d202d5"} Feb 18 06:53:44 crc kubenswrapper[4707]: I0218 06:53:44.001091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-mdd8s" event={"ID":"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e","Type":"ContainerStarted","Data":"c45539328481ead1ebbe5831efcb786619be23111dce5fb8bb2215b9a9da5ea0"} Feb 18 06:53:45 crc kubenswrapper[4707]: I0218 06:53:45.012742 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerID="c45539328481ead1ebbe5831efcb786619be23111dce5fb8bb2215b9a9da5ea0" exitCode=0 Feb 18 06:53:45 crc kubenswrapper[4707]: I0218 06:53:45.012841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd8s" event={"ID":"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e","Type":"ContainerDied","Data":"c45539328481ead1ebbe5831efcb786619be23111dce5fb8bb2215b9a9da5ea0"} Feb 18 06:53:46 crc kubenswrapper[4707]: I0218 06:53:46.024585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd8s" event={"ID":"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e","Type":"ContainerStarted","Data":"db5efe911000663577b522f9aa664f61961427733325672b1c3c6cb3641f67c1"} Feb 18 06:53:46 crc kubenswrapper[4707]: I0218 06:53:46.050109 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mdd8s" podStartSLOduration=2.617500811 podStartE2EDuration="5.050088172s" podCreationTimestamp="2026-02-18 06:53:41 +0000 UTC" firstStartedPulling="2026-02-18 06:53:42.99401395 +0000 UTC m=+3959.641973084" lastFinishedPulling="2026-02-18 06:53:45.426601311 +0000 UTC m=+3962.074560445" observedRunningTime="2026-02-18 06:53:46.044557802 +0000 UTC m=+3962.692516936" watchObservedRunningTime="2026-02-18 06:53:46.050088172 +0000 UTC m=+3962.698047306" Feb 18 06:53:51 crc kubenswrapper[4707]: I0218 06:53:51.668868 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:51 crc kubenswrapper[4707]: I0218 06:53:51.669445 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:51 crc kubenswrapper[4707]: I0218 06:53:51.716336 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:52 crc kubenswrapper[4707]: I0218 06:53:52.120862 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:52 crc kubenswrapper[4707]: I0218 06:53:52.195781 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd8s"] Feb 18 06:53:54 crc kubenswrapper[4707]: I0218 06:53:54.083872 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mdd8s" podUID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerName="registry-server" containerID="cri-o://db5efe911000663577b522f9aa664f61961427733325672b1c3c6cb3641f67c1" gracePeriod=2 Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.095220 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerID="db5efe911000663577b522f9aa664f61961427733325672b1c3c6cb3641f67c1" exitCode=0 Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.095309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd8s" event={"ID":"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e","Type":"ContainerDied","Data":"db5efe911000663577b522f9aa664f61961427733325672b1c3c6cb3641f67c1"} Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.278764 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.359809 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76jj9\" (UniqueName: \"kubernetes.io/projected/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-kube-api-access-76jj9\") pod \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.359943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-utilities\") pod \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.359970 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-catalog-content\") pod \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\" (UID: \"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e\") " Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.360629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-utilities" (OuterVolumeSpecName: "utilities") pod "b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" (UID: "b8ee834e-e9e1-4b24-abcd-0d624ad1b44e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.365608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-kube-api-access-76jj9" (OuterVolumeSpecName: "kube-api-access-76jj9") pod "b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" (UID: "b8ee834e-e9e1-4b24-abcd-0d624ad1b44e"). InnerVolumeSpecName "kube-api-access-76jj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.386900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" (UID: "b8ee834e-e9e1-4b24-abcd-0d624ad1b44e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.462549 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.462584 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:53:55 crc kubenswrapper[4707]: I0218 06:53:55.462597 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76jj9\" (UniqueName: \"kubernetes.io/projected/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e-kube-api-access-76jj9\") on node \"crc\" DevicePath \"\"" Feb 18 06:53:56 crc kubenswrapper[4707]: I0218 06:53:56.129745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd8s" event={"ID":"b8ee834e-e9e1-4b24-abcd-0d624ad1b44e","Type":"ContainerDied","Data":"a1c49b30b4d2f950f7b4420d63c46d75a01e49c6f0dec4ef736d872041d202d5"} Feb 18 06:53:56 crc kubenswrapper[4707]: I0218 06:53:56.129842 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdd8s" Feb 18 06:53:56 crc kubenswrapper[4707]: I0218 06:53:56.130177 4707 scope.go:117] "RemoveContainer" containerID="db5efe911000663577b522f9aa664f61961427733325672b1c3c6cb3641f67c1" Feb 18 06:53:56 crc kubenswrapper[4707]: I0218 06:53:56.161861 4707 scope.go:117] "RemoveContainer" containerID="c45539328481ead1ebbe5831efcb786619be23111dce5fb8bb2215b9a9da5ea0" Feb 18 06:53:56 crc kubenswrapper[4707]: I0218 06:53:56.161973 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd8s"] Feb 18 06:53:56 crc kubenswrapper[4707]: I0218 06:53:56.172674 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd8s"] Feb 18 06:53:56 crc kubenswrapper[4707]: I0218 06:53:56.659904 4707 scope.go:117] "RemoveContainer" containerID="ff829f8000d0e8471def10388b2e102e8f7c18a7f6e4164899ca4bfce93d4b21" Feb 18 06:53:58 crc kubenswrapper[4707]: I0218 06:53:58.063777 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" path="/var/lib/kubelet/pods/b8ee834e-e9e1-4b24-abcd-0d624ad1b44e/volumes" Feb 18 06:54:21 crc kubenswrapper[4707]: I0218 06:54:21.382862 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:54:21 crc kubenswrapper[4707]: I0218 06:54:21.383459 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:54:39 crc kubenswrapper[4707]: 
I0218 06:54:39.568227 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wx9bd"] Feb 18 06:54:39 crc kubenswrapper[4707]: E0218 06:54:39.569169 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerName="extract-utilities" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.569184 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerName="extract-utilities" Feb 18 06:54:39 crc kubenswrapper[4707]: E0218 06:54:39.569210 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerName="registry-server" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.569216 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerName="registry-server" Feb 18 06:54:39 crc kubenswrapper[4707]: E0218 06:54:39.569230 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerName="extract-content" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.569236 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerName="extract-content" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.569470 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ee834e-e9e1-4b24-abcd-0d624ad1b44e" containerName="registry-server" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.570773 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.639479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wx9bd"] Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.691768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpg4g\" (UniqueName: \"kubernetes.io/projected/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-kube-api-access-zpg4g\") pod \"redhat-operators-wx9bd\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.691844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-utilities\") pod \"redhat-operators-wx9bd\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.691917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-catalog-content\") pod \"redhat-operators-wx9bd\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.793363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpg4g\" (UniqueName: \"kubernetes.io/projected/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-kube-api-access-zpg4g\") pod \"redhat-operators-wx9bd\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.793413 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-utilities\") pod \"redhat-operators-wx9bd\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.793463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-catalog-content\") pod \"redhat-operators-wx9bd\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.793956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-catalog-content\") pod \"redhat-operators-wx9bd\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.794023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-utilities\") pod \"redhat-operators-wx9bd\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.821908 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpg4g\" (UniqueName: \"kubernetes.io/projected/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-kube-api-access-zpg4g\") pod \"redhat-operators-wx9bd\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:39 crc kubenswrapper[4707]: I0218 06:54:39.890170 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:40 crc kubenswrapper[4707]: I0218 06:54:40.432391 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wx9bd"] Feb 18 06:54:40 crc kubenswrapper[4707]: I0218 06:54:40.544481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wx9bd" event={"ID":"8065e0b0-ab41-415c-a16e-fa5da8c8ce96","Type":"ContainerStarted","Data":"8a0afb30071ae7ef254d89b4745a012700086187f0be1e52bca873af5d11d18d"} Feb 18 06:54:41 crc kubenswrapper[4707]: I0218 06:54:41.562150 4707 generic.go:334] "Generic (PLEG): container finished" podID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerID="0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70" exitCode=0 Feb 18 06:54:41 crc kubenswrapper[4707]: I0218 06:54:41.562643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wx9bd" event={"ID":"8065e0b0-ab41-415c-a16e-fa5da8c8ce96","Type":"ContainerDied","Data":"0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70"} Feb 18 06:54:41 crc kubenswrapper[4707]: I0218 06:54:41.565496 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 06:54:42 crc kubenswrapper[4707]: I0218 06:54:42.573352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wx9bd" event={"ID":"8065e0b0-ab41-415c-a16e-fa5da8c8ce96","Type":"ContainerStarted","Data":"f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb"} Feb 18 06:54:49 crc kubenswrapper[4707]: I0218 06:54:49.672389 4707 generic.go:334] "Generic (PLEG): container finished" podID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerID="f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb" exitCode=0 Feb 18 06:54:49 crc kubenswrapper[4707]: I0218 06:54:49.672493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wx9bd" event={"ID":"8065e0b0-ab41-415c-a16e-fa5da8c8ce96","Type":"ContainerDied","Data":"f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb"} Feb 18 06:54:49 crc kubenswrapper[4707]: I0218 06:54:49.823594 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rplpk"] Feb 18 06:54:49 crc kubenswrapper[4707]: I0218 06:54:49.825953 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:49 crc kubenswrapper[4707]: I0218 06:54:49.856909 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rplpk"] Feb 18 06:54:49 crc kubenswrapper[4707]: I0218 06:54:49.913748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-utilities\") pod \"certified-operators-rplpk\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:49 crc kubenswrapper[4707]: I0218 06:54:49.913826 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56rtb\" (UniqueName: \"kubernetes.io/projected/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-kube-api-access-56rtb\") pod \"certified-operators-rplpk\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:49 crc kubenswrapper[4707]: I0218 06:54:49.914311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-catalog-content\") pod \"certified-operators-rplpk\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:50 
crc kubenswrapper[4707]: I0218 06:54:50.016371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-catalog-content\") pod \"certified-operators-rplpk\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 06:54:50.016478 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-utilities\") pod \"certified-operators-rplpk\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 06:54:50.016506 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56rtb\" (UniqueName: \"kubernetes.io/projected/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-kube-api-access-56rtb\") pod \"certified-operators-rplpk\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 06:54:50.017491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-catalog-content\") pod \"certified-operators-rplpk\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 06:54:50.017548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-utilities\") pod \"certified-operators-rplpk\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 
06:54:50.044651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56rtb\" (UniqueName: \"kubernetes.io/projected/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-kube-api-access-56rtb\") pod \"certified-operators-rplpk\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 06:54:50.143245 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:54:50 crc kubenswrapper[4707]: W0218 06:54:50.640483 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8804b7f_8076_4327_8d6f_9b7cbf8b0f98.slice/crio-c990d1886cbcaeb83885c8e41b313226c823e541259e2013dcac0d9d2f57fb76 WatchSource:0}: Error finding container c990d1886cbcaeb83885c8e41b313226c823e541259e2013dcac0d9d2f57fb76: Status 404 returned error can't find the container with id c990d1886cbcaeb83885c8e41b313226c823e541259e2013dcac0d9d2f57fb76 Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 06:54:50.643988 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rplpk"] Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 06:54:50.683170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rplpk" event={"ID":"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98","Type":"ContainerStarted","Data":"c990d1886cbcaeb83885c8e41b313226c823e541259e2013dcac0d9d2f57fb76"} Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 06:54:50.686749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wx9bd" event={"ID":"8065e0b0-ab41-415c-a16e-fa5da8c8ce96","Type":"ContainerStarted","Data":"7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6"} Feb 18 06:54:50 crc kubenswrapper[4707]: I0218 06:54:50.714197 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wx9bd" podStartSLOduration=3.247813295 podStartE2EDuration="11.714173438s" podCreationTimestamp="2026-02-18 06:54:39 +0000 UTC" firstStartedPulling="2026-02-18 06:54:41.565303569 +0000 UTC m=+4018.213262703" lastFinishedPulling="2026-02-18 06:54:50.031663702 +0000 UTC m=+4026.679622846" observedRunningTime="2026-02-18 06:54:50.712830673 +0000 UTC m=+4027.360789797" watchObservedRunningTime="2026-02-18 06:54:50.714173438 +0000 UTC m=+4027.362132572" Feb 18 06:54:51 crc kubenswrapper[4707]: I0218 06:54:51.382393 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:54:51 crc kubenswrapper[4707]: I0218 06:54:51.382697 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 06:54:51 crc kubenswrapper[4707]: I0218 06:54:51.695724 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerID="202cdb729e9dbc70a5427b4e8c7389f293646dd59e0116fd6b704a875a02ed9d" exitCode=0 Feb 18 06:54:51 crc kubenswrapper[4707]: I0218 06:54:51.695850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rplpk" event={"ID":"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98","Type":"ContainerDied","Data":"202cdb729e9dbc70a5427b4e8c7389f293646dd59e0116fd6b704a875a02ed9d"} Feb 18 06:54:52 crc kubenswrapper[4707]: I0218 06:54:52.707670 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-rplpk" event={"ID":"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98","Type":"ContainerStarted","Data":"b746c4720498dd80eef6363ca35a09e2f9a36281f8d7782495bc0bb05d0bf4c9"} Feb 18 06:54:55 crc kubenswrapper[4707]: I0218 06:54:55.741073 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerID="b746c4720498dd80eef6363ca35a09e2f9a36281f8d7782495bc0bb05d0bf4c9" exitCode=0 Feb 18 06:54:55 crc kubenswrapper[4707]: I0218 06:54:55.741133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rplpk" event={"ID":"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98","Type":"ContainerDied","Data":"b746c4720498dd80eef6363ca35a09e2f9a36281f8d7782495bc0bb05d0bf4c9"} Feb 18 06:54:56 crc kubenswrapper[4707]: I0218 06:54:56.753394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rplpk" event={"ID":"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98","Type":"ContainerStarted","Data":"c3ee8ee112d8e5f0e37fa7f92fc7ac8b4ac8fa8f2bdd91154e115024a0675a8d"} Feb 18 06:54:56 crc kubenswrapper[4707]: I0218 06:54:56.771083 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rplpk" podStartSLOduration=3.229932272 podStartE2EDuration="7.771066396s" podCreationTimestamp="2026-02-18 06:54:49 +0000 UTC" firstStartedPulling="2026-02-18 06:54:51.698176284 +0000 UTC m=+4028.346135418" lastFinishedPulling="2026-02-18 06:54:56.239310408 +0000 UTC m=+4032.887269542" observedRunningTime="2026-02-18 06:54:56.770941832 +0000 UTC m=+4033.418900966" watchObservedRunningTime="2026-02-18 06:54:56.771066396 +0000 UTC m=+4033.419025550" Feb 18 06:54:59 crc kubenswrapper[4707]: I0218 06:54:59.890701 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:54:59 crc kubenswrapper[4707]: I0218 06:54:59.891422 
4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:55:00 crc kubenswrapper[4707]: I0218 06:55:00.143587 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:55:00 crc kubenswrapper[4707]: I0218 06:55:00.143919 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:55:00 crc kubenswrapper[4707]: I0218 06:55:00.942350 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wx9bd" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerName="registry-server" probeResult="failure" output=< Feb 18 06:55:00 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Feb 18 06:55:00 crc kubenswrapper[4707]: > Feb 18 06:55:01 crc kubenswrapper[4707]: I0218 06:55:01.202231 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rplpk" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerName="registry-server" probeResult="failure" output=< Feb 18 06:55:01 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Feb 18 06:55:01 crc kubenswrapper[4707]: > Feb 18 06:55:09 crc kubenswrapper[4707]: I0218 06:55:09.942150 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:55:09 crc kubenswrapper[4707]: I0218 06:55:09.990833 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:55:10 crc kubenswrapper[4707]: I0218 06:55:10.194671 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:55:10 crc kubenswrapper[4707]: I0218 06:55:10.247140 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:55:11 crc kubenswrapper[4707]: I0218 06:55:11.170353 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wx9bd"] Feb 18 06:55:11 crc kubenswrapper[4707]: I0218 06:55:11.881842 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wx9bd" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerName="registry-server" containerID="cri-o://7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6" gracePeriod=2 Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.573836 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rplpk"] Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.574789 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rplpk" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerName="registry-server" containerID="cri-o://c3ee8ee112d8e5f0e37fa7f92fc7ac8b4ac8fa8f2bdd91154e115024a0675a8d" gracePeriod=2 Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.666677 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.705535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-catalog-content\") pod \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.705621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-utilities\") pod \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.705726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpg4g\" (UniqueName: \"kubernetes.io/projected/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-kube-api-access-zpg4g\") pod \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\" (UID: \"8065e0b0-ab41-415c-a16e-fa5da8c8ce96\") " Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.707245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-utilities" (OuterVolumeSpecName: "utilities") pod "8065e0b0-ab41-415c-a16e-fa5da8c8ce96" (UID: "8065e0b0-ab41-415c-a16e-fa5da8c8ce96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.714818 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-kube-api-access-zpg4g" (OuterVolumeSpecName: "kube-api-access-zpg4g") pod "8065e0b0-ab41-415c-a16e-fa5da8c8ce96" (UID: "8065e0b0-ab41-415c-a16e-fa5da8c8ce96"). InnerVolumeSpecName "kube-api-access-zpg4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.814463 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.814504 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpg4g\" (UniqueName: \"kubernetes.io/projected/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-kube-api-access-zpg4g\") on node \"crc\" DevicePath \"\"" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.847136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8065e0b0-ab41-415c-a16e-fa5da8c8ce96" (UID: "8065e0b0-ab41-415c-a16e-fa5da8c8ce96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.894258 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerID="c3ee8ee112d8e5f0e37fa7f92fc7ac8b4ac8fa8f2bdd91154e115024a0675a8d" exitCode=0 Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.894328 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rplpk" event={"ID":"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98","Type":"ContainerDied","Data":"c3ee8ee112d8e5f0e37fa7f92fc7ac8b4ac8fa8f2bdd91154e115024a0675a8d"} Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.896480 4707 generic.go:334] "Generic (PLEG): container finished" podID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerID="7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6" exitCode=0 Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.896507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wx9bd" event={"ID":"8065e0b0-ab41-415c-a16e-fa5da8c8ce96","Type":"ContainerDied","Data":"7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6"} Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.896525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wx9bd" event={"ID":"8065e0b0-ab41-415c-a16e-fa5da8c8ce96","Type":"ContainerDied","Data":"8a0afb30071ae7ef254d89b4745a012700086187f0be1e52bca873af5d11d18d"} Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.896540 4707 scope.go:117] "RemoveContainer" containerID="7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.896654 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wx9bd" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.916152 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8065e0b0-ab41-415c-a16e-fa5da8c8ce96-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.922634 4707 scope.go:117] "RemoveContainer" containerID="f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb" Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.936921 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wx9bd"] Feb 18 06:55:12 crc kubenswrapper[4707]: I0218 06:55:12.952844 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wx9bd"] Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.005032 4707 scope.go:117] "RemoveContainer" containerID="0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.089122 4707 scope.go:117] "RemoveContainer" 
containerID="7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6" Feb 18 06:55:13 crc kubenswrapper[4707]: E0218 06:55:13.090348 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6\": container with ID starting with 7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6 not found: ID does not exist" containerID="7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.090384 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6"} err="failed to get container status \"7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6\": rpc error: code = NotFound desc = could not find container \"7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6\": container with ID starting with 7cc0fc1ea4504f66ef54f66620c97a64a697c836a48d71a93bc2d0f8e7fe13f6 not found: ID does not exist" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.090441 4707 scope.go:117] "RemoveContainer" containerID="f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb" Feb 18 06:55:13 crc kubenswrapper[4707]: E0218 06:55:13.090711 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb\": container with ID starting with f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb not found: ID does not exist" containerID="f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.090731 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb"} err="failed to get container status \"f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb\": rpc error: code = NotFound desc = could not find container \"f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb\": container with ID starting with f76f2419794455dfb1f60fa5a586240ef82d8d9fca93da8211bdda25fd8ac9cb not found: ID does not exist" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.090746 4707 scope.go:117] "RemoveContainer" containerID="0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70" Feb 18 06:55:13 crc kubenswrapper[4707]: E0218 06:55:13.091031 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70\": container with ID starting with 0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70 not found: ID does not exist" containerID="0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.091053 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70"} err="failed to get container status \"0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70\": rpc error: code = NotFound desc = could not find container \"0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70\": container with ID starting with 0b865c07bd65ab73e1dae9f2e2990bafb97cb5d3931d3ebe54089e933cf90b70 not found: ID does not exist" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.340004 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.425180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56rtb\" (UniqueName: \"kubernetes.io/projected/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-kube-api-access-56rtb\") pod \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.425293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-utilities\") pod \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.425441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-catalog-content\") pod \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\" (UID: \"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98\") " Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.430391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-utilities" (OuterVolumeSpecName: "utilities") pod "d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" (UID: "d8804b7f-8076-4327-8d6f-9b7cbf8b0f98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.435099 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-kube-api-access-56rtb" (OuterVolumeSpecName: "kube-api-access-56rtb") pod "d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" (UID: "d8804b7f-8076-4327-8d6f-9b7cbf8b0f98"). InnerVolumeSpecName "kube-api-access-56rtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.490024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" (UID: "d8804b7f-8076-4327-8d6f-9b7cbf8b0f98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.528152 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56rtb\" (UniqueName: \"kubernetes.io/projected/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-kube-api-access-56rtb\") on node \"crc\" DevicePath \"\"" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.528190 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.528200 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.908128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rplpk" event={"ID":"d8804b7f-8076-4327-8d6f-9b7cbf8b0f98","Type":"ContainerDied","Data":"c990d1886cbcaeb83885c8e41b313226c823e541259e2013dcac0d9d2f57fb76"} Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.908183 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rplpk" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.908196 4707 scope.go:117] "RemoveContainer" containerID="c3ee8ee112d8e5f0e37fa7f92fc7ac8b4ac8fa8f2bdd91154e115024a0675a8d" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.928372 4707 scope.go:117] "RemoveContainer" containerID="b746c4720498dd80eef6363ca35a09e2f9a36281f8d7782495bc0bb05d0bf4c9" Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.943993 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rplpk"] Feb 18 06:55:13 crc kubenswrapper[4707]: I0218 06:55:13.951001 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rplpk"] Feb 18 06:55:14 crc kubenswrapper[4707]: I0218 06:55:14.069814 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" path="/var/lib/kubelet/pods/8065e0b0-ab41-415c-a16e-fa5da8c8ce96/volumes" Feb 18 06:55:14 crc kubenswrapper[4707]: I0218 06:55:14.070627 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" path="/var/lib/kubelet/pods/d8804b7f-8076-4327-8d6f-9b7cbf8b0f98/volumes" Feb 18 06:55:14 crc kubenswrapper[4707]: I0218 06:55:14.350076 4707 scope.go:117] "RemoveContainer" containerID="202cdb729e9dbc70a5427b4e8c7389f293646dd59e0116fd6b704a875a02ed9d" Feb 18 06:55:21 crc kubenswrapper[4707]: I0218 06:55:21.382427 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 06:55:21 crc kubenswrapper[4707]: I0218 06:55:21.383424 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:55:21 crc kubenswrapper[4707]: I0218 06:55:21.383512 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6"
Feb 18 06:55:21 crc kubenswrapper[4707]: I0218 06:55:21.385032 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fe239756b1524d5039921de98375e95766bc6e0c5eee5de9059af464be56f58"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 06:55:21 crc kubenswrapper[4707]: I0218 06:55:21.385113 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://9fe239756b1524d5039921de98375e95766bc6e0c5eee5de9059af464be56f58" gracePeriod=600
Feb 18 06:55:21 crc kubenswrapper[4707]: I0218 06:55:21.982255 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="9fe239756b1524d5039921de98375e95766bc6e0c5eee5de9059af464be56f58" exitCode=0
Feb 18 06:55:21 crc kubenswrapper[4707]: I0218 06:55:21.982292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"9fe239756b1524d5039921de98375e95766bc6e0c5eee5de9059af464be56f58"}
Feb 18 06:55:21 crc kubenswrapper[4707]: I0218 06:55:21.982822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"}
Feb 18 06:55:21 crc kubenswrapper[4707]: I0218 06:55:21.982843 4707 scope.go:117] "RemoveContainer" containerID="eefb8db7fe75d35a8c7e4cb50ee1a9ee0caa6c798d56fd41caf197ab0d190fc2"
Feb 18 06:55:28 crc kubenswrapper[4707]: I0218 06:55:28.180402 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-878756b99-xx5vn" podUID="a6b4c749-b753-42b9-8bc7-fb25121f0ea8" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Feb 18 06:57:21 crc kubenswrapper[4707]: I0218 06:57:21.381833 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:57:21 crc kubenswrapper[4707]: I0218 06:57:21.382469 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:57:51 crc kubenswrapper[4707]: I0218 06:57:51.382126 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:57:51 crc kubenswrapper[4707]: I0218 06:57:51.382634 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:58:21 crc kubenswrapper[4707]: I0218 06:58:21.381977 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 06:58:21 crc kubenswrapper[4707]: I0218 06:58:21.383506 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 06:58:21 crc kubenswrapper[4707]: I0218 06:58:21.383623 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6"
Feb 18 06:58:21 crc kubenswrapper[4707]: I0218 06:58:21.525958 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 06:58:21 crc kubenswrapper[4707]: I0218 06:58:21.526061 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" gracePeriod=600
Feb 18 06:58:21 crc kubenswrapper[4707]: E0218 06:58:21.649472 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850"
Feb 18 06:58:22 crc kubenswrapper[4707]: I0218 06:58:22.533726 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" exitCode=0
Feb 18 06:58:22 crc kubenswrapper[4707]: I0218 06:58:22.533776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"}
Feb 18 06:58:22 crc kubenswrapper[4707]: I0218 06:58:22.534150 4707 scope.go:117] "RemoveContainer" containerID="9fe239756b1524d5039921de98375e95766bc6e0c5eee5de9059af464be56f58"
Feb 18 06:58:22 crc kubenswrapper[4707]: I0218 06:58:22.534841 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"
Feb 18 06:58:22 crc kubenswrapper[4707]: E0218 06:58:22.535161 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850"
Feb 18 06:58:37 crc kubenswrapper[4707]: I0218 06:58:37.054869 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"
Feb 18 06:58:37 crc kubenswrapper[4707]: E0218 06:58:37.055704 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850"
Feb 18 06:58:52 crc kubenswrapper[4707]: I0218 06:58:52.053590 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"
Feb 18 06:58:52 crc kubenswrapper[4707]: E0218 06:58:52.054296 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850"
Feb 18 06:59:07 crc kubenswrapper[4707]: I0218 06:59:07.053932 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"
Feb 18 06:59:07 crc kubenswrapper[4707]: E0218 06:59:07.055217 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850"
Feb 18 06:59:18 crc kubenswrapper[4707]: I0218 06:59:18.053617 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"
Feb 18 06:59:18 crc kubenswrapper[4707]: E0218 06:59:18.054465 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850"
Feb 18 06:59:33 crc kubenswrapper[4707]: I0218 06:59:33.053023 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"
Feb 18 06:59:33 crc kubenswrapper[4707]: E0218 06:59:33.054260 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.558538 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cd4mk"]
Feb 18 06:59:36 crc kubenswrapper[4707]: E0218 06:59:36.559418 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerName="extract-utilities"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.559437 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerName="extract-utilities"
Feb 18 06:59:36 crc kubenswrapper[4707]: E0218 06:59:36.559452 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerName="extract-content"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.559460 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerName="extract-content"
Feb 18 06:59:36 crc kubenswrapper[4707]: E0218 06:59:36.559481 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerName="extract-content"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.559489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerName="extract-content"
Feb 18 06:59:36 crc kubenswrapper[4707]: E0218 06:59:36.559511 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerName="registry-server"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.559518 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerName="registry-server"
Feb 18 06:59:36 crc kubenswrapper[4707]: E0218 06:59:36.559543 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerName="extract-utilities"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.559550 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerName="extract-utilities"
Feb 18 06:59:36 crc kubenswrapper[4707]: E0218 06:59:36.559565 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerName="registry-server"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.559572 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerName="registry-server"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.559781 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8804b7f-8076-4327-8d6f-9b7cbf8b0f98" containerName="registry-server"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.559833 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8065e0b0-ab41-415c-a16e-fa5da8c8ce96" containerName="registry-server"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.561467 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.569681 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cd4mk"]
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.664604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-utilities\") pod \"community-operators-cd4mk\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") " pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.664883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4jj\" (UniqueName: \"kubernetes.io/projected/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-kube-api-access-rg4jj\") pod \"community-operators-cd4mk\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") " pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.664923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-catalog-content\") pod \"community-operators-cd4mk\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") " pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.767538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-utilities\") pod \"community-operators-cd4mk\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") " pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.767588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4jj\" (UniqueName: \"kubernetes.io/projected/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-kube-api-access-rg4jj\") pod \"community-operators-cd4mk\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") " pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.767616 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-catalog-content\") pod \"community-operators-cd4mk\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") " pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.768085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-utilities\") pod \"community-operators-cd4mk\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") " pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.768203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-catalog-content\") pod \"community-operators-cd4mk\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") " pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.794712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4jj\" (UniqueName: \"kubernetes.io/projected/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-kube-api-access-rg4jj\") pod \"community-operators-cd4mk\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") " pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:36 crc kubenswrapper[4707]: I0218 06:59:36.884419 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:37 crc kubenswrapper[4707]: I0218 06:59:37.497142 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cd4mk"]
Feb 18 06:59:38 crc kubenswrapper[4707]: I0218 06:59:38.161415 4707 generic.go:334] "Generic (PLEG): container finished" podID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerID="500ee61064d4bb5eecbb8fba5dfbf375080ee0666b4922cd1596be7c740e391c" exitCode=0
Feb 18 06:59:38 crc kubenswrapper[4707]: I0218 06:59:38.161612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cd4mk" event={"ID":"02dae9e5-d9de-44a8-a40d-e94a45e4e87c","Type":"ContainerDied","Data":"500ee61064d4bb5eecbb8fba5dfbf375080ee0666b4922cd1596be7c740e391c"}
Feb 18 06:59:38 crc kubenswrapper[4707]: I0218 06:59:38.161780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cd4mk" event={"ID":"02dae9e5-d9de-44a8-a40d-e94a45e4e87c","Type":"ContainerStarted","Data":"bbbd89bca9e8df406f2339e67d999ed6fe4a79884c5924318a4a346a34dbe071"}
Feb 18 06:59:39 crc kubenswrapper[4707]: I0218 06:59:39.173925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cd4mk" event={"ID":"02dae9e5-d9de-44a8-a40d-e94a45e4e87c","Type":"ContainerStarted","Data":"db7b1c7d13ffb3d9cc70fe7c5b287d3739a1648751f74148694902c29ab2a924"}
Feb 18 06:59:41 crc kubenswrapper[4707]: I0218 06:59:41.196118 4707 generic.go:334] "Generic (PLEG): container finished" podID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerID="db7b1c7d13ffb3d9cc70fe7c5b287d3739a1648751f74148694902c29ab2a924" exitCode=0
Feb 18 06:59:41 crc kubenswrapper[4707]: I0218 06:59:41.196208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cd4mk" event={"ID":"02dae9e5-d9de-44a8-a40d-e94a45e4e87c","Type":"ContainerDied","Data":"db7b1c7d13ffb3d9cc70fe7c5b287d3739a1648751f74148694902c29ab2a924"}
Feb 18 06:59:42 crc kubenswrapper[4707]: I0218 06:59:42.215891 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cd4mk" event={"ID":"02dae9e5-d9de-44a8-a40d-e94a45e4e87c","Type":"ContainerStarted","Data":"fa61c1f9d5205fd9a9e64f51a49c2f89f6622e30209a3d971943313d8ef4bc2f"}
Feb 18 06:59:42 crc kubenswrapper[4707]: I0218 06:59:42.238298 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cd4mk" podStartSLOduration=2.6513706089999998 podStartE2EDuration="6.238279884s" podCreationTimestamp="2026-02-18 06:59:36 +0000 UTC" firstStartedPulling="2026-02-18 06:59:38.163338125 +0000 UTC m=+4314.811297259" lastFinishedPulling="2026-02-18 06:59:41.7502474 +0000 UTC m=+4318.398206534" observedRunningTime="2026-02-18 06:59:42.237039071 +0000 UTC m=+4318.884998215" watchObservedRunningTime="2026-02-18 06:59:42.238279884 +0000 UTC m=+4318.886239018"
Feb 18 06:59:44 crc kubenswrapper[4707]: I0218 06:59:44.064441 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"
Feb 18 06:59:44 crc kubenswrapper[4707]: E0218 06:59:44.066483 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850"
Feb 18 06:59:46 crc kubenswrapper[4707]: I0218 06:59:46.884706 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:46 crc kubenswrapper[4707]: I0218 06:59:46.885060 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:46 crc kubenswrapper[4707]: I0218 06:59:46.939530 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:47 crc kubenswrapper[4707]: I0218 06:59:47.296816 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:47 crc kubenswrapper[4707]: I0218 06:59:47.346702 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cd4mk"]
Feb 18 06:59:49 crc kubenswrapper[4707]: I0218 06:59:49.266278 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cd4mk" podUID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerName="registry-server" containerID="cri-o://fa61c1f9d5205fd9a9e64f51a49c2f89f6622e30209a3d971943313d8ef4bc2f" gracePeriod=2
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.277863 4707 generic.go:334] "Generic (PLEG): container finished" podID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerID="fa61c1f9d5205fd9a9e64f51a49c2f89f6622e30209a3d971943313d8ef4bc2f" exitCode=0
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.277937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cd4mk" event={"ID":"02dae9e5-d9de-44a8-a40d-e94a45e4e87c","Type":"ContainerDied","Data":"fa61c1f9d5205fd9a9e64f51a49c2f89f6622e30209a3d971943313d8ef4bc2f"}
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.469162 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.554984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-utilities\") pod \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") "
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.555138 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-catalog-content\") pod \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") "
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.555293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg4jj\" (UniqueName: \"kubernetes.io/projected/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-kube-api-access-rg4jj\") pod \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\" (UID: \"02dae9e5-d9de-44a8-a40d-e94a45e4e87c\") "
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.556524 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-utilities" (OuterVolumeSpecName: "utilities") pod "02dae9e5-d9de-44a8-a40d-e94a45e4e87c" (UID: "02dae9e5-d9de-44a8-a40d-e94a45e4e87c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.566145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-kube-api-access-rg4jj" (OuterVolumeSpecName: "kube-api-access-rg4jj") pod "02dae9e5-d9de-44a8-a40d-e94a45e4e87c" (UID: "02dae9e5-d9de-44a8-a40d-e94a45e4e87c"). InnerVolumeSpecName "kube-api-access-rg4jj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.613359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02dae9e5-d9de-44a8-a40d-e94a45e4e87c" (UID: "02dae9e5-d9de-44a8-a40d-e94a45e4e87c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.657383 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.657419 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 06:59:50 crc kubenswrapper[4707]: I0218 06:59:50.657431 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg4jj\" (UniqueName: \"kubernetes.io/projected/02dae9e5-d9de-44a8-a40d-e94a45e4e87c-kube-api-access-rg4jj\") on node \"crc\" DevicePath \"\""
Feb 18 06:59:51 crc kubenswrapper[4707]: I0218 06:59:51.289623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cd4mk" event={"ID":"02dae9e5-d9de-44a8-a40d-e94a45e4e87c","Type":"ContainerDied","Data":"bbbd89bca9e8df406f2339e67d999ed6fe4a79884c5924318a4a346a34dbe071"}
Feb 18 06:59:51 crc kubenswrapper[4707]: I0218 06:59:51.289687 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cd4mk"
Feb 18 06:59:51 crc kubenswrapper[4707]: I0218 06:59:51.290106 4707 scope.go:117] "RemoveContainer" containerID="fa61c1f9d5205fd9a9e64f51a49c2f89f6622e30209a3d971943313d8ef4bc2f"
Feb 18 06:59:51 crc kubenswrapper[4707]: I0218 06:59:51.312713 4707 scope.go:117] "RemoveContainer" containerID="db7b1c7d13ffb3d9cc70fe7c5b287d3739a1648751f74148694902c29ab2a924"
Feb 18 06:59:51 crc kubenswrapper[4707]: I0218 06:59:51.336404 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cd4mk"]
Feb 18 06:59:51 crc kubenswrapper[4707]: I0218 06:59:51.344368 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cd4mk"]
Feb 18 06:59:51 crc kubenswrapper[4707]: I0218 06:59:51.357075 4707 scope.go:117] "RemoveContainer" containerID="500ee61064d4bb5eecbb8fba5dfbf375080ee0666b4922cd1596be7c740e391c"
Feb 18 06:59:52 crc kubenswrapper[4707]: I0218 06:59:52.077608 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" path="/var/lib/kubelet/pods/02dae9e5-d9de-44a8-a40d-e94a45e4e87c/volumes"
Feb 18 06:59:55 crc kubenswrapper[4707]: I0218 06:59:55.054722 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e"
Feb 18 06:59:55 crc kubenswrapper[4707]: E0218 06:59:55.055546 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.196736 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"]
Feb 18 07:00:00 crc kubenswrapper[4707]: E0218 07:00:00.197868 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerName="extract-utilities"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.197887 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerName="extract-utilities"
Feb 18 07:00:00 crc kubenswrapper[4707]: E0218 07:00:00.197917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerName="extract-content"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.197925 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerName="extract-content"
Feb 18 07:00:00 crc kubenswrapper[4707]: E0218 07:00:00.197940 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerName="registry-server"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.197948 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerName="registry-server"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.198200 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="02dae9e5-d9de-44a8-a40d-e94a45e4e87c" containerName="registry-server"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.199495 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.203345 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.204620 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.214002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"]
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.246306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjmr5\" (UniqueName: \"kubernetes.io/projected/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-kube-api-access-kjmr5\") pod \"collect-profiles-29523300-n2gkb\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.246476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-secret-volume\") pod \"collect-profiles-29523300-n2gkb\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.246508 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-config-volume\") pod \"collect-profiles-29523300-n2gkb\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.348149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjmr5\" (UniqueName: \"kubernetes.io/projected/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-kube-api-access-kjmr5\") pod \"collect-profiles-29523300-n2gkb\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.348242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-secret-volume\") pod \"collect-profiles-29523300-n2gkb\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.348274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-config-volume\") pod \"collect-profiles-29523300-n2gkb\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.349431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-config-volume\") pod \"collect-profiles-29523300-n2gkb\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.354234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-secret-volume\") pod \"collect-profiles-29523300-n2gkb\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.366733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjmr5\" (UniqueName: \"kubernetes.io/projected/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-kube-api-access-kjmr5\") pod \"collect-profiles-29523300-n2gkb\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:00 crc kubenswrapper[4707]: I0218 07:00:00.530639 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:01 crc kubenswrapper[4707]: I0218 07:00:01.021695 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"]
Feb 18 07:00:02 crc kubenswrapper[4707]: I0218 07:00:02.387195 4707 generic.go:334] "Generic (PLEG): container finished" podID="2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c" containerID="cb41e3be56e5d7008c051648300408d0813084bb7e3907aef2db54e5ce389ba9" exitCode=0
Feb 18 07:00:02 crc kubenswrapper[4707]: I0218 07:00:02.387307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb" event={"ID":"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c","Type":"ContainerDied","Data":"cb41e3be56e5d7008c051648300408d0813084bb7e3907aef2db54e5ce389ba9"}
Feb 18 07:00:02 crc kubenswrapper[4707]: I0218 07:00:02.387684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb" event={"ID":"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c","Type":"ContainerStarted","Data":"7bf1e8334fc342ec4946e6b2efd8e5164e6de17f25bc8ba016a6d3d33c933e99"}
Feb 18 07:00:03 crc kubenswrapper[4707]: I0218 07:00:03.832906 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb"
Feb 18 07:00:03 crc kubenswrapper[4707]: I0218 07:00:03.920842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-config-volume\") pod \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") "
Feb 18 07:00:03 crc kubenswrapper[4707]: I0218 07:00:03.921254 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-config-volume" (OuterVolumeSpecName: "config-volume") pod "2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c" (UID: "2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 07:00:03 crc kubenswrapper[4707]: I0218 07:00:03.921428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-secret-volume\") pod \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") "
Feb 18 07:00:03 crc kubenswrapper[4707]: I0218 07:00:03.921554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjmr5\" (UniqueName: \"kubernetes.io/projected/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-kube-api-access-kjmr5\") pod \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\" (UID: \"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c\") "
Feb 18 07:00:03 crc kubenswrapper[4707]: I0218 07:00:03.922078 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 07:00:03 crc kubenswrapper[4707]: I0218 07:00:03.927748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-kube-api-access-kjmr5" (OuterVolumeSpecName: "kube-api-access-kjmr5") pod "2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c" (UID: "2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c"). InnerVolumeSpecName "kube-api-access-kjmr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 07:00:03 crc kubenswrapper[4707]: I0218 07:00:03.941611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c" (UID: "2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 07:00:04 crc kubenswrapper[4707]: I0218 07:00:04.023627 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjmr5\" (UniqueName: \"kubernetes.io/projected/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-kube-api-access-kjmr5\") on node \"crc\" DevicePath \"\"" Feb 18 07:00:04 crc kubenswrapper[4707]: I0218 07:00:04.023664 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 07:00:04 crc kubenswrapper[4707]: I0218 07:00:04.404711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb" event={"ID":"2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c","Type":"ContainerDied","Data":"7bf1e8334fc342ec4946e6b2efd8e5164e6de17f25bc8ba016a6d3d33c933e99"} Feb 18 07:00:04 crc kubenswrapper[4707]: I0218 07:00:04.404749 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523300-n2gkb" Feb 18 07:00:04 crc kubenswrapper[4707]: I0218 07:00:04.404761 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bf1e8334fc342ec4946e6b2efd8e5164e6de17f25bc8ba016a6d3d33c933e99" Feb 18 07:00:04 crc kubenswrapper[4707]: I0218 07:00:04.914544 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n"] Feb 18 07:00:04 crc kubenswrapper[4707]: I0218 07:00:04.924473 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523255-vft5n"] Feb 18 07:00:06 crc kubenswrapper[4707]: I0218 07:00:06.066340 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8019647-1774-483b-b11e-b478b894487f" path="/var/lib/kubelet/pods/e8019647-1774-483b-b11e-b478b894487f/volumes" Feb 18 07:00:08 crc kubenswrapper[4707]: I0218 07:00:08.053247 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:00:08 crc kubenswrapper[4707]: E0218 07:00:08.054223 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:00:21 crc kubenswrapper[4707]: I0218 07:00:21.053354 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:00:21 crc kubenswrapper[4707]: E0218 07:00:21.054398 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:00:32 crc kubenswrapper[4707]: I0218 07:00:32.052701 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:00:32 crc kubenswrapper[4707]: E0218 07:00:32.053542 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:00:44 crc kubenswrapper[4707]: I0218 07:00:44.061924 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:00:44 crc kubenswrapper[4707]: E0218 07:00:44.063085 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:00:50 crc kubenswrapper[4707]: I0218 07:00:50.607492 4707 scope.go:117] "RemoveContainer" containerID="692387cbd9ffa3ca4a91d95e9bad67c1250a94a77fdb6d86a94157bbebbd572c" Feb 18 07:00:59 crc kubenswrapper[4707]: I0218 07:00:59.053187 4707 scope.go:117] "RemoveContainer" 
containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:00:59 crc kubenswrapper[4707]: E0218 07:00:59.054152 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.150077 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29523301-k4v49"] Feb 18 07:01:00 crc kubenswrapper[4707]: E0218 07:01:00.150909 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c" containerName="collect-profiles" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.150929 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c" containerName="collect-profiles" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.151229 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fbf1b2b-009e-4c2e-9869-18b5ff8c6f8c" containerName="collect-profiles" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.152002 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.169628 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29523301-k4v49"] Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.291326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-config-data\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.291370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-fernet-keys\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.291445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdlbs\" (UniqueName: \"kubernetes.io/projected/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-kube-api-access-bdlbs\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.291528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-combined-ca-bundle\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.393657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-config-data\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.393701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-fernet-keys\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.393767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdlbs\" (UniqueName: \"kubernetes.io/projected/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-kube-api-access-bdlbs\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.393821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-combined-ca-bundle\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.409256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-combined-ca-bundle\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.409353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-config-data\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.412215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-fernet-keys\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.413065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdlbs\" (UniqueName: \"kubernetes.io/projected/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-kube-api-access-bdlbs\") pod \"keystone-cron-29523301-k4v49\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.504388 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:00 crc kubenswrapper[4707]: I0218 07:01:00.957761 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29523301-k4v49"] Feb 18 07:01:01 crc kubenswrapper[4707]: I0218 07:01:01.882707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523301-k4v49" event={"ID":"8fcf8387-c297-4bb6-acc4-810bb4fab9e5","Type":"ContainerStarted","Data":"9809bceb13d7198c974175de70ae3f0ebfcaa2819ffabe6e227a3e09e569022d"} Feb 18 07:01:01 crc kubenswrapper[4707]: I0218 07:01:01.883576 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523301-k4v49" event={"ID":"8fcf8387-c297-4bb6-acc4-810bb4fab9e5","Type":"ContainerStarted","Data":"744b7c1ceea823f47d296e6905aca7607d0f0a5a094e5444675f12d36572ba9e"} Feb 18 07:01:01 crc kubenswrapper[4707]: I0218 07:01:01.905368 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29523301-k4v49" podStartSLOduration=1.9053445359999999 podStartE2EDuration="1.905344536s" podCreationTimestamp="2026-02-18 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 07:01:01.896164257 +0000 UTC m=+4398.544123411" watchObservedRunningTime="2026-02-18 07:01:01.905344536 +0000 UTC m=+4398.553303670" Feb 18 07:01:04 crc kubenswrapper[4707]: I0218 07:01:04.911342 4707 generic.go:334] "Generic (PLEG): container finished" podID="8fcf8387-c297-4bb6-acc4-810bb4fab9e5" containerID="9809bceb13d7198c974175de70ae3f0ebfcaa2819ffabe6e227a3e09e569022d" exitCode=0 Feb 18 07:01:04 crc kubenswrapper[4707]: I0218 07:01:04.911373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523301-k4v49" 
event={"ID":"8fcf8387-c297-4bb6-acc4-810bb4fab9e5","Type":"ContainerDied","Data":"9809bceb13d7198c974175de70ae3f0ebfcaa2819ffabe6e227a3e09e569022d"} Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.438265 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.456472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-config-data\") pod \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.456591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-fernet-keys\") pod \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.456680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdlbs\" (UniqueName: \"kubernetes.io/projected/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-kube-api-access-bdlbs\") pod \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.456861 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-combined-ca-bundle\") pod \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\" (UID: \"8fcf8387-c297-4bb6-acc4-810bb4fab9e5\") " Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.471198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-kube-api-access-bdlbs" 
(OuterVolumeSpecName: "kube-api-access-bdlbs") pod "8fcf8387-c297-4bb6-acc4-810bb4fab9e5" (UID: "8fcf8387-c297-4bb6-acc4-810bb4fab9e5"). InnerVolumeSpecName "kube-api-access-bdlbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.472149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8fcf8387-c297-4bb6-acc4-810bb4fab9e5" (UID: "8fcf8387-c297-4bb6-acc4-810bb4fab9e5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.491610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fcf8387-c297-4bb6-acc4-810bb4fab9e5" (UID: "8fcf8387-c297-4bb6-acc4-810bb4fab9e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.516955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-config-data" (OuterVolumeSpecName: "config-data") pod "8fcf8387-c297-4bb6-acc4-810bb4fab9e5" (UID: "8fcf8387-c297-4bb6-acc4-810bb4fab9e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.559234 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.559273 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.559286 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdlbs\" (UniqueName: \"kubernetes.io/projected/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-kube-api-access-bdlbs\") on node \"crc\" DevicePath \"\"" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.559298 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcf8387-c297-4bb6-acc4-810bb4fab9e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.931764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523301-k4v49" event={"ID":"8fcf8387-c297-4bb6-acc4-810bb4fab9e5","Type":"ContainerDied","Data":"744b7c1ceea823f47d296e6905aca7607d0f0a5a094e5444675f12d36572ba9e"} Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.932132 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="744b7c1ceea823f47d296e6905aca7607d0f0a5a094e5444675f12d36572ba9e" Feb 18 07:01:06 crc kubenswrapper[4707]: I0218 07:01:06.931864 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29523301-k4v49" Feb 18 07:01:14 crc kubenswrapper[4707]: I0218 07:01:14.062005 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:01:14 crc kubenswrapper[4707]: E0218 07:01:14.062832 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:01:26 crc kubenswrapper[4707]: I0218 07:01:26.053086 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:01:26 crc kubenswrapper[4707]: E0218 07:01:26.053947 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:01:39 crc kubenswrapper[4707]: I0218 07:01:39.054121 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:01:39 crc kubenswrapper[4707]: E0218 07:01:39.055150 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:01:52 crc kubenswrapper[4707]: I0218 07:01:52.053510 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:01:52 crc kubenswrapper[4707]: E0218 07:01:52.054418 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:02:06 crc kubenswrapper[4707]: I0218 07:02:06.053762 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:02:06 crc kubenswrapper[4707]: E0218 07:02:06.054730 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:02:19 crc kubenswrapper[4707]: I0218 07:02:19.053572 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:02:19 crc kubenswrapper[4707]: E0218 07:02:19.054538 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:02:31 crc kubenswrapper[4707]: I0218 07:02:31.053573 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:02:31 crc kubenswrapper[4707]: E0218 07:02:31.054525 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:02:43 crc kubenswrapper[4707]: I0218 07:02:43.053046 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:02:43 crc kubenswrapper[4707]: E0218 07:02:43.053910 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:02:54 crc kubenswrapper[4707]: I0218 07:02:54.083376 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:02:54 crc kubenswrapper[4707]: E0218 07:02:54.084269 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:03:05 crc kubenswrapper[4707]: I0218 07:03:05.056038 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:03:05 crc kubenswrapper[4707]: E0218 07:03:05.056914 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:03:16 crc kubenswrapper[4707]: I0218 07:03:16.129769 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:03:16 crc kubenswrapper[4707]: E0218 07:03:16.130498 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:03:31 crc kubenswrapper[4707]: I0218 07:03:31.054117 4707 scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:03:31 crc kubenswrapper[4707]: I0218 07:03:31.868898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" 
event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"75729da4be911af2deb616fb3bb9270ac6b0fe680ac28da2d1b09117643f8e74"} Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.759258 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xvt7g"] Feb 18 07:04:09 crc kubenswrapper[4707]: E0218 07:04:09.760508 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fcf8387-c297-4bb6-acc4-810bb4fab9e5" containerName="keystone-cron" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.760526 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fcf8387-c297-4bb6-acc4-810bb4fab9e5" containerName="keystone-cron" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.761018 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fcf8387-c297-4bb6-acc4-810bb4fab9e5" containerName="keystone-cron" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.762885 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.771327 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvt7g"] Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.863210 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-utilities\") pod \"redhat-marketplace-xvt7g\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.863259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj2cb\" (UniqueName: \"kubernetes.io/projected/a5eb9f52-a91d-4945-9bdd-393be47f95ce-kube-api-access-qj2cb\") pod \"redhat-marketplace-xvt7g\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.863295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-catalog-content\") pod \"redhat-marketplace-xvt7g\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.965454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-utilities\") pod \"redhat-marketplace-xvt7g\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.965750 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qj2cb\" (UniqueName: \"kubernetes.io/projected/a5eb9f52-a91d-4945-9bdd-393be47f95ce-kube-api-access-qj2cb\") pod \"redhat-marketplace-xvt7g\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.965876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-catalog-content\") pod \"redhat-marketplace-xvt7g\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.966159 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-utilities\") pod \"redhat-marketplace-xvt7g\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.966182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-catalog-content\") pod \"redhat-marketplace-xvt7g\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:09 crc kubenswrapper[4707]: I0218 07:04:09.990930 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj2cb\" (UniqueName: \"kubernetes.io/projected/a5eb9f52-a91d-4945-9bdd-393be47f95ce-kube-api-access-qj2cb\") pod \"redhat-marketplace-xvt7g\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:10 crc kubenswrapper[4707]: I0218 07:04:10.088463 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:10 crc kubenswrapper[4707]: I0218 07:04:10.563942 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvt7g"] Feb 18 07:04:11 crc kubenswrapper[4707]: I0218 07:04:11.214886 4707 generic.go:334] "Generic (PLEG): container finished" podID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerID="9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679" exitCode=0 Feb 18 07:04:11 crc kubenswrapper[4707]: I0218 07:04:11.214967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvt7g" event={"ID":"a5eb9f52-a91d-4945-9bdd-393be47f95ce","Type":"ContainerDied","Data":"9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679"} Feb 18 07:04:11 crc kubenswrapper[4707]: I0218 07:04:11.215207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvt7g" event={"ID":"a5eb9f52-a91d-4945-9bdd-393be47f95ce","Type":"ContainerStarted","Data":"971945f2d10bf59dc25e67f4dfc750deaa898af495449cf59ba8e5d3ed9a7c4c"} Feb 18 07:04:11 crc kubenswrapper[4707]: I0218 07:04:11.217165 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 07:04:12 crc kubenswrapper[4707]: I0218 07:04:12.229044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvt7g" event={"ID":"a5eb9f52-a91d-4945-9bdd-393be47f95ce","Type":"ContainerStarted","Data":"5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66"} Feb 18 07:04:13 crc kubenswrapper[4707]: I0218 07:04:13.239246 4707 generic.go:334] "Generic (PLEG): container finished" podID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerID="5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66" exitCode=0 Feb 18 07:04:13 crc kubenswrapper[4707]: I0218 07:04:13.239300 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-xvt7g" event={"ID":"a5eb9f52-a91d-4945-9bdd-393be47f95ce","Type":"ContainerDied","Data":"5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66"} Feb 18 07:04:14 crc kubenswrapper[4707]: I0218 07:04:14.249988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvt7g" event={"ID":"a5eb9f52-a91d-4945-9bdd-393be47f95ce","Type":"ContainerStarted","Data":"b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed"} Feb 18 07:04:14 crc kubenswrapper[4707]: I0218 07:04:14.275563 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xvt7g" podStartSLOduration=2.850587906 podStartE2EDuration="5.275544638s" podCreationTimestamp="2026-02-18 07:04:09 +0000 UTC" firstStartedPulling="2026-02-18 07:04:11.216938044 +0000 UTC m=+4587.864897178" lastFinishedPulling="2026-02-18 07:04:13.641894776 +0000 UTC m=+4590.289853910" observedRunningTime="2026-02-18 07:04:14.26709173 +0000 UTC m=+4590.915050874" watchObservedRunningTime="2026-02-18 07:04:14.275544638 +0000 UTC m=+4590.923503772" Feb 18 07:04:20 crc kubenswrapper[4707]: I0218 07:04:20.089151 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:20 crc kubenswrapper[4707]: I0218 07:04:20.089953 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:20 crc kubenswrapper[4707]: I0218 07:04:20.150258 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:20 crc kubenswrapper[4707]: I0218 07:04:20.370909 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:20 crc kubenswrapper[4707]: I0218 07:04:20.436158 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvt7g"] Feb 18 07:04:22 crc kubenswrapper[4707]: I0218 07:04:22.316429 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xvt7g" podUID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerName="registry-server" containerID="cri-o://b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed" gracePeriod=2 Feb 18 07:04:22 crc kubenswrapper[4707]: I0218 07:04:22.937247 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.036116 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-utilities\") pod \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.036240 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj2cb\" (UniqueName: \"kubernetes.io/projected/a5eb9f52-a91d-4945-9bdd-393be47f95ce-kube-api-access-qj2cb\") pod \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.036299 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-catalog-content\") pod \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\" (UID: \"a5eb9f52-a91d-4945-9bdd-393be47f95ce\") " Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.037200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-utilities" (OuterVolumeSpecName: "utilities") pod 
"a5eb9f52-a91d-4945-9bdd-393be47f95ce" (UID: "a5eb9f52-a91d-4945-9bdd-393be47f95ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.046305 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5eb9f52-a91d-4945-9bdd-393be47f95ce-kube-api-access-qj2cb" (OuterVolumeSpecName: "kube-api-access-qj2cb") pod "a5eb9f52-a91d-4945-9bdd-393be47f95ce" (UID: "a5eb9f52-a91d-4945-9bdd-393be47f95ce"). InnerVolumeSpecName "kube-api-access-qj2cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.063863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5eb9f52-a91d-4945-9bdd-393be47f95ce" (UID: "a5eb9f52-a91d-4945-9bdd-393be47f95ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.138347 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj2cb\" (UniqueName: \"kubernetes.io/projected/a5eb9f52-a91d-4945-9bdd-393be47f95ce-kube-api-access-qj2cb\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.138387 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.138396 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5eb9f52-a91d-4945-9bdd-393be47f95ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.325285 4707 generic.go:334] "Generic (PLEG): container finished" podID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerID="b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed" exitCode=0 Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.325325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvt7g" event={"ID":"a5eb9f52-a91d-4945-9bdd-393be47f95ce","Type":"ContainerDied","Data":"b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed"} Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.325351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvt7g" event={"ID":"a5eb9f52-a91d-4945-9bdd-393be47f95ce","Type":"ContainerDied","Data":"971945f2d10bf59dc25e67f4dfc750deaa898af495449cf59ba8e5d3ed9a7c4c"} Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.325366 4707 scope.go:117] "RemoveContainer" containerID="b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 
07:04:23.325481 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvt7g" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.354623 4707 scope.go:117] "RemoveContainer" containerID="5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.365253 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvt7g"] Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.374610 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvt7g"] Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.383109 4707 scope.go:117] "RemoveContainer" containerID="9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.418283 4707 scope.go:117] "RemoveContainer" containerID="b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed" Feb 18 07:04:23 crc kubenswrapper[4707]: E0218 07:04:23.418805 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed\": container with ID starting with b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed not found: ID does not exist" containerID="b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.418835 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed"} err="failed to get container status \"b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed\": rpc error: code = NotFound desc = could not find container \"b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed\": container with ID starting with 
b1fcfee0871f5f5084eae067a8ff2b86a40e90978703a7d4bc046ea75964e9ed not found: ID does not exist" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.418855 4707 scope.go:117] "RemoveContainer" containerID="5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66" Feb 18 07:04:23 crc kubenswrapper[4707]: E0218 07:04:23.419236 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66\": container with ID starting with 5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66 not found: ID does not exist" containerID="5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.419262 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66"} err="failed to get container status \"5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66\": rpc error: code = NotFound desc = could not find container \"5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66\": container with ID starting with 5e405e088afa559455fda8ab737d3824abbc2a5ff1cb128e0bb83e27e67b2f66 not found: ID does not exist" Feb 18 07:04:23 crc kubenswrapper[4707]: I0218 07:04:23.419276 4707 scope.go:117] "RemoveContainer" containerID="9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679" Feb 18 07:04:23 crc kubenswrapper[4707]: E0218 07:04:23.419750 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679\": container with ID starting with 9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679 not found: ID does not exist" containerID="9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679" Feb 18 07:04:23 crc 
kubenswrapper[4707]: I0218 07:04:23.419829 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679"} err="failed to get container status \"9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679\": rpc error: code = NotFound desc = could not find container \"9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679\": container with ID starting with 9fa59a29f324e0a1f9329435c3628dfa52710288fe1b9bee634709920c5de679 not found: ID does not exist" Feb 18 07:04:24 crc kubenswrapper[4707]: I0218 07:04:24.064155 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" path="/var/lib/kubelet/pods/a5eb9f52-a91d-4945-9bdd-393be47f95ce/volumes" Feb 18 07:04:50 crc kubenswrapper[4707]: I0218 07:04:50.654128 4707 generic.go:334] "Generic (PLEG): container finished" podID="e369f41f-534e-48ee-bdcb-da26b742cfc3" containerID="595bf5305d0d9237a2ee119ae6203e07713a8164a83768166db93d45cf8a5124" exitCode=0 Feb 18 07:04:50 crc kubenswrapper[4707]: I0218 07:04:50.654240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e369f41f-534e-48ee-bdcb-da26b742cfc3","Type":"ContainerDied","Data":"595bf5305d0d9237a2ee119ae6203e07713a8164a83768166db93d45cf8a5124"} Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.025946 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.076259 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-workdir\") pod \"e369f41f-534e-48ee-bdcb-da26b742cfc3\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.076306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-temporary\") pod \"e369f41f-534e-48ee-bdcb-da26b742cfc3\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.076396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfn9q\" (UniqueName: \"kubernetes.io/projected/e369f41f-534e-48ee-bdcb-da26b742cfc3-kube-api-access-dfn9q\") pod \"e369f41f-534e-48ee-bdcb-da26b742cfc3\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.076436 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"e369f41f-534e-48ee-bdcb-da26b742cfc3\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.076517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ssh-key\") pod \"e369f41f-534e-48ee-bdcb-da26b742cfc3\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.076551 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config\") pod \"e369f41f-534e-48ee-bdcb-da26b742cfc3\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.076608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config-secret\") pod \"e369f41f-534e-48ee-bdcb-da26b742cfc3\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.076624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-config-data\") pod \"e369f41f-534e-48ee-bdcb-da26b742cfc3\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.076663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ca-certs\") pod \"e369f41f-534e-48ee-bdcb-da26b742cfc3\" (UID: \"e369f41f-534e-48ee-bdcb-da26b742cfc3\") " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.080332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e369f41f-534e-48ee-bdcb-da26b742cfc3" (UID: "e369f41f-534e-48ee-bdcb-da26b742cfc3"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.082067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-config-data" (OuterVolumeSpecName: "config-data") pod "e369f41f-534e-48ee-bdcb-da26b742cfc3" (UID: "e369f41f-534e-48ee-bdcb-da26b742cfc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.084346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e369f41f-534e-48ee-bdcb-da26b742cfc3" (UID: "e369f41f-534e-48ee-bdcb-da26b742cfc3"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.089720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e369f41f-534e-48ee-bdcb-da26b742cfc3" (UID: "e369f41f-534e-48ee-bdcb-da26b742cfc3"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.091436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e369f41f-534e-48ee-bdcb-da26b742cfc3-kube-api-access-dfn9q" (OuterVolumeSpecName: "kube-api-access-dfn9q") pod "e369f41f-534e-48ee-bdcb-da26b742cfc3" (UID: "e369f41f-534e-48ee-bdcb-da26b742cfc3"). InnerVolumeSpecName "kube-api-access-dfn9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.110212 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e369f41f-534e-48ee-bdcb-da26b742cfc3" (UID: "e369f41f-534e-48ee-bdcb-da26b742cfc3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.110739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e369f41f-534e-48ee-bdcb-da26b742cfc3" (UID: "e369f41f-534e-48ee-bdcb-da26b742cfc3"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.120286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e369f41f-534e-48ee-bdcb-da26b742cfc3" (UID: "e369f41f-534e-48ee-bdcb-da26b742cfc3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.136761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e369f41f-534e-48ee-bdcb-da26b742cfc3" (UID: "e369f41f-534e-48ee-bdcb-da26b742cfc3"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.179089 4707 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.179131 4707 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e369f41f-534e-48ee-bdcb-da26b742cfc3-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.179148 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfn9q\" (UniqueName: \"kubernetes.io/projected/e369f41f-534e-48ee-bdcb-da26b742cfc3-kube-api-access-dfn9q\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.179185 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.179200 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.179211 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.179223 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-openstack-config-secret\") on node \"crc\" DevicePath 
\"\"" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.179238 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e369f41f-534e-48ee-bdcb-da26b742cfc3-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.179249 4707 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e369f41f-534e-48ee-bdcb-da26b742cfc3-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.202961 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.281047 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.672550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e369f41f-534e-48ee-bdcb-da26b742cfc3","Type":"ContainerDied","Data":"8a70ea655f8a17102d760869bfe181df5573edbdd0ebec0f46cdff9a32a8dd56"} Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.672870 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a70ea655f8a17102d760869bfe181df5573edbdd0ebec0f46cdff9a32a8dd56" Feb 18 07:04:52 crc kubenswrapper[4707]: I0218 07:04:52.672625 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.664541 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mj79f"] Feb 18 07:05:03 crc kubenswrapper[4707]: E0218 07:05:03.665484 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerName="extract-content" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.665499 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerName="extract-content" Feb 18 07:05:03 crc kubenswrapper[4707]: E0218 07:05:03.665515 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerName="extract-utilities" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.665523 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerName="extract-utilities" Feb 18 07:05:03 crc kubenswrapper[4707]: E0218 07:05:03.665550 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerName="registry-server" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.665556 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerName="registry-server" Feb 18 07:05:03 crc kubenswrapper[4707]: E0218 07:05:03.665574 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e369f41f-534e-48ee-bdcb-da26b742cfc3" containerName="tempest-tests-tempest-tests-runner" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.665579 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e369f41f-534e-48ee-bdcb-da26b742cfc3" containerName="tempest-tests-tempest-tests-runner" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.665760 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a5eb9f52-a91d-4945-9bdd-393be47f95ce" containerName="registry-server" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.665783 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e369f41f-534e-48ee-bdcb-da26b742cfc3" containerName="tempest-tests-tempest-tests-runner" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.667169 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.714054 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mj79f"] Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.802896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8swq\" (UniqueName: \"kubernetes.io/projected/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-kube-api-access-j8swq\") pod \"certified-operators-mj79f\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.803256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-utilities\") pod \"certified-operators-mj79f\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.803443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-catalog-content\") pod \"certified-operators-mj79f\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.844646 4707 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.846699 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.869553 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.905545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-catalog-content\") pod \"certified-operators-mj79f\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.905639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcfq7\" (UniqueName: \"kubernetes.io/projected/3c04cb22-866a-403f-9a07-1a12cfd909e2-kube-api-access-rcfq7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c04cb22-866a-403f-9a07-1a12cfd909e2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.906130 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8swq\" (UniqueName: \"kubernetes.io/projected/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-kube-api-access-j8swq\") pod \"certified-operators-mj79f\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.906189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c04cb22-866a-403f-9a07-1a12cfd909e2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.906300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-utilities\") pod \"certified-operators-mj79f\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.906358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-catalog-content\") pod \"certified-operators-mj79f\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.906661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-utilities\") pod \"certified-operators-mj79f\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.930601 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8swq\" (UniqueName: \"kubernetes.io/projected/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-kube-api-access-j8swq\") pod \"certified-operators-mj79f\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:03 crc kubenswrapper[4707]: I0218 07:05:03.998671 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.008019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c04cb22-866a-403f-9a07-1a12cfd909e2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.008169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcfq7\" (UniqueName: \"kubernetes.io/projected/3c04cb22-866a-403f-9a07-1a12cfd909e2-kube-api-access-rcfq7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c04cb22-866a-403f-9a07-1a12cfd909e2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.008556 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c04cb22-866a-403f-9a07-1a12cfd909e2\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.029292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcfq7\" (UniqueName: \"kubernetes.io/projected/3c04cb22-866a-403f-9a07-1a12cfd909e2-kube-api-access-rcfq7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c04cb22-866a-403f-9a07-1a12cfd909e2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.040643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c04cb22-866a-403f-9a07-1a12cfd909e2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.170103 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.530338 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mj79f"] Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.728465 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.782396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3c04cb22-866a-403f-9a07-1a12cfd909e2","Type":"ContainerStarted","Data":"c223e8831e7eed8d130e956cfcfd94c984f4fdf3df86a971a1dc67ee4994ad28"} Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.784971 4707 generic.go:334] "Generic (PLEG): container finished" podID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerID="bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d" exitCode=0 Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.785021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj79f" event={"ID":"aafc4782-ef50-4e42-9e23-48e2aa21fdb6","Type":"ContainerDied","Data":"bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d"} Feb 18 07:05:04 crc kubenswrapper[4707]: I0218 07:05:04.785047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj79f" 
event={"ID":"aafc4782-ef50-4e42-9e23-48e2aa21fdb6","Type":"ContainerStarted","Data":"b8136c6cb8474c0abe9bf5e11a34d52861fc3a692ee989c793e0477252f7a875"} Feb 18 07:05:05 crc kubenswrapper[4707]: I0218 07:05:05.794869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3c04cb22-866a-403f-9a07-1a12cfd909e2","Type":"ContainerStarted","Data":"ff5f44d108efdc6bfded25bf568060570ac13a6e067dec020d6b443eb6f3a688"} Feb 18 07:05:05 crc kubenswrapper[4707]: I0218 07:05:05.796929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj79f" event={"ID":"aafc4782-ef50-4e42-9e23-48e2aa21fdb6","Type":"ContainerStarted","Data":"2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1"} Feb 18 07:05:05 crc kubenswrapper[4707]: I0218 07:05:05.812046 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.030485719 podStartE2EDuration="2.812026212s" podCreationTimestamp="2026-02-18 07:05:03 +0000 UTC" firstStartedPulling="2026-02-18 07:05:04.736528327 +0000 UTC m=+4641.384487461" lastFinishedPulling="2026-02-18 07:05:05.51806882 +0000 UTC m=+4642.166027954" observedRunningTime="2026-02-18 07:05:05.8086233 +0000 UTC m=+4642.456582434" watchObservedRunningTime="2026-02-18 07:05:05.812026212 +0000 UTC m=+4642.459985346" Feb 18 07:05:07 crc kubenswrapper[4707]: I0218 07:05:07.815944 4707 generic.go:334] "Generic (PLEG): container finished" podID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerID="2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1" exitCode=0 Feb 18 07:05:07 crc kubenswrapper[4707]: I0218 07:05:07.816019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj79f" 
event={"ID":"aafc4782-ef50-4e42-9e23-48e2aa21fdb6","Type":"ContainerDied","Data":"2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1"} Feb 18 07:05:09 crc kubenswrapper[4707]: I0218 07:05:09.833537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj79f" event={"ID":"aafc4782-ef50-4e42-9e23-48e2aa21fdb6","Type":"ContainerStarted","Data":"54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a"} Feb 18 07:05:09 crc kubenswrapper[4707]: I0218 07:05:09.861937 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mj79f" podStartSLOduration=3.376590024 podStartE2EDuration="6.861897622s" podCreationTimestamp="2026-02-18 07:05:03 +0000 UTC" firstStartedPulling="2026-02-18 07:05:04.786505889 +0000 UTC m=+4641.434465023" lastFinishedPulling="2026-02-18 07:05:08.271813487 +0000 UTC m=+4644.919772621" observedRunningTime="2026-02-18 07:05:09.853525136 +0000 UTC m=+4646.501484310" watchObservedRunningTime="2026-02-18 07:05:09.861897622 +0000 UTC m=+4646.509856796" Feb 18 07:05:13 crc kubenswrapper[4707]: I0218 07:05:13.999020 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:14 crc kubenswrapper[4707]: I0218 07:05:13.999756 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:14 crc kubenswrapper[4707]: I0218 07:05:14.051335 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:14 crc kubenswrapper[4707]: I0218 07:05:14.915484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:14 crc kubenswrapper[4707]: I0218 07:05:14.968531 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-mj79f"] Feb 18 07:05:17 crc kubenswrapper[4707]: I0218 07:05:17.533525 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mj79f" podUID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerName="registry-server" containerID="cri-o://54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a" gracePeriod=2 Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.457629 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.547932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-utilities\") pod \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.547978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-catalog-content\") pod \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.548217 4707 generic.go:334] "Generic (PLEG): container finished" podID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerID="54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a" exitCode=0 Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.548259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj79f" event={"ID":"aafc4782-ef50-4e42-9e23-48e2aa21fdb6","Type":"ContainerDied","Data":"54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a"} Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.548293 4707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mj79f" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.548319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj79f" event={"ID":"aafc4782-ef50-4e42-9e23-48e2aa21fdb6","Type":"ContainerDied","Data":"b8136c6cb8474c0abe9bf5e11a34d52861fc3a692ee989c793e0477252f7a875"} Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.548344 4707 scope.go:117] "RemoveContainer" containerID="54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.548343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8swq\" (UniqueName: \"kubernetes.io/projected/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-kube-api-access-j8swq\") pod \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\" (UID: \"aafc4782-ef50-4e42-9e23-48e2aa21fdb6\") " Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.548921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-utilities" (OuterVolumeSpecName: "utilities") pod "aafc4782-ef50-4e42-9e23-48e2aa21fdb6" (UID: "aafc4782-ef50-4e42-9e23-48e2aa21fdb6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.552312 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.583273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-kube-api-access-j8swq" (OuterVolumeSpecName: "kube-api-access-j8swq") pod "aafc4782-ef50-4e42-9e23-48e2aa21fdb6" (UID: "aafc4782-ef50-4e42-9e23-48e2aa21fdb6"). InnerVolumeSpecName "kube-api-access-j8swq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.584216 4707 scope.go:117] "RemoveContainer" containerID="2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.622603 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aafc4782-ef50-4e42-9e23-48e2aa21fdb6" (UID: "aafc4782-ef50-4e42-9e23-48e2aa21fdb6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.630949 4707 scope.go:117] "RemoveContainer" containerID="bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.653882 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.654096 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8swq\" (UniqueName: \"kubernetes.io/projected/aafc4782-ef50-4e42-9e23-48e2aa21fdb6-kube-api-access-j8swq\") on node \"crc\" DevicePath \"\"" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.673350 4707 scope.go:117] "RemoveContainer" containerID="54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a" Feb 18 07:05:18 crc kubenswrapper[4707]: E0218 07:05:18.673713 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a\": container with ID starting with 54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a not found: ID does not exist" containerID="54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.673743 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a"} err="failed to get container status \"54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a\": rpc error: code = NotFound desc = could not find container \"54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a\": container with ID starting with 54f7594dcbb3e29c40ecedfa42e1979ee944cfdd260d7d8877e3db50ebb80b9a not 
found: ID does not exist" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.673769 4707 scope.go:117] "RemoveContainer" containerID="2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1" Feb 18 07:05:18 crc kubenswrapper[4707]: E0218 07:05:18.674120 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1\": container with ID starting with 2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1 not found: ID does not exist" containerID="2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.674160 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1"} err="failed to get container status \"2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1\": rpc error: code = NotFound desc = could not find container \"2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1\": container with ID starting with 2ec0a46b6fd687104470eb91548fc6e298cc82faefe09ba105aed48a72b328d1 not found: ID does not exist" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.674180 4707 scope.go:117] "RemoveContainer" containerID="bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d" Feb 18 07:05:18 crc kubenswrapper[4707]: E0218 07:05:18.674410 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d\": container with ID starting with bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d not found: ID does not exist" containerID="bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.674446 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d"} err="failed to get container status \"bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d\": rpc error: code = NotFound desc = could not find container \"bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d\": container with ID starting with bc1789b50c59a35d463602682b84205295dc7237e13be7aa5ff4d4352366c92d not found: ID does not exist" Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.889871 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mj79f"] Feb 18 07:05:18 crc kubenswrapper[4707]: I0218 07:05:18.899489 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mj79f"] Feb 18 07:05:20 crc kubenswrapper[4707]: I0218 07:05:20.070648 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" path="/var/lib/kubelet/pods/aafc4782-ef50-4e42-9e23-48e2aa21fdb6/volumes" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.332183 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-frwrf/must-gather-98km4"] Feb 18 07:05:28 crc kubenswrapper[4707]: E0218 07:05:28.333193 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerName="extract-content" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.333210 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerName="extract-content" Feb 18 07:05:28 crc kubenswrapper[4707]: E0218 07:05:28.333239 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerName="registry-server" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.333247 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerName="registry-server" Feb 18 07:05:28 crc kubenswrapper[4707]: E0218 07:05:28.333259 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerName="extract-utilities" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.333266 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerName="extract-utilities" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.333495 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aafc4782-ef50-4e42-9e23-48e2aa21fdb6" containerName="registry-server" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.334913 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.336942 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-frwrf"/"kube-root-ca.crt" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.337158 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-frwrf"/"default-dockercfg-cmd9m" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.337547 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-frwrf"/"openshift-service-ca.crt" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.343641 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-frwrf/must-gather-98km4"] Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.457612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0409d47c-4b51-4b82-86e9-be8f5fc24024-must-gather-output\") pod \"must-gather-98km4\" (UID: \"0409d47c-4b51-4b82-86e9-be8f5fc24024\") " 
pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.457977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wtqp\" (UniqueName: \"kubernetes.io/projected/0409d47c-4b51-4b82-86e9-be8f5fc24024-kube-api-access-8wtqp\") pod \"must-gather-98km4\" (UID: \"0409d47c-4b51-4b82-86e9-be8f5fc24024\") " pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.559580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wtqp\" (UniqueName: \"kubernetes.io/projected/0409d47c-4b51-4b82-86e9-be8f5fc24024-kube-api-access-8wtqp\") pod \"must-gather-98km4\" (UID: \"0409d47c-4b51-4b82-86e9-be8f5fc24024\") " pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.559720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0409d47c-4b51-4b82-86e9-be8f5fc24024-must-gather-output\") pod \"must-gather-98km4\" (UID: \"0409d47c-4b51-4b82-86e9-be8f5fc24024\") " pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.560150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0409d47c-4b51-4b82-86e9-be8f5fc24024-must-gather-output\") pod \"must-gather-98km4\" (UID: \"0409d47c-4b51-4b82-86e9-be8f5fc24024\") " pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.587544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wtqp\" (UniqueName: \"kubernetes.io/projected/0409d47c-4b51-4b82-86e9-be8f5fc24024-kube-api-access-8wtqp\") pod \"must-gather-98km4\" (UID: \"0409d47c-4b51-4b82-86e9-be8f5fc24024\") " 
pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:05:28 crc kubenswrapper[4707]: I0218 07:05:28.651464 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:05:29 crc kubenswrapper[4707]: I0218 07:05:29.207480 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-frwrf/must-gather-98km4"] Feb 18 07:05:29 crc kubenswrapper[4707]: I0218 07:05:29.642115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/must-gather-98km4" event={"ID":"0409d47c-4b51-4b82-86e9-be8f5fc24024","Type":"ContainerStarted","Data":"8e6bd136da71219b81f8a3fcd95a2950f0347213a7ebfedf3c09ed89294ffbbf"} Feb 18 07:05:35 crc kubenswrapper[4707]: I0218 07:05:35.701142 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/must-gather-98km4" event={"ID":"0409d47c-4b51-4b82-86e9-be8f5fc24024","Type":"ContainerStarted","Data":"36363cb52d45dc1edada9d23f42f1a2ceda1fc8d376433f9d011172a6d0342bd"} Feb 18 07:05:36 crc kubenswrapper[4707]: I0218 07:05:36.711474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/must-gather-98km4" event={"ID":"0409d47c-4b51-4b82-86e9-be8f5fc24024","Type":"ContainerStarted","Data":"295cb615aacc716f0c84378671a49b8267f27d88d4e7ef7f26ae4ff525a3e237"} Feb 18 07:05:36 crc kubenswrapper[4707]: I0218 07:05:36.737284 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-frwrf/must-gather-98km4" podStartSLOduration=2.714944245 podStartE2EDuration="8.736910666s" podCreationTimestamp="2026-02-18 07:05:28 +0000 UTC" firstStartedPulling="2026-02-18 07:05:29.209181539 +0000 UTC m=+4665.857140673" lastFinishedPulling="2026-02-18 07:05:35.23114796 +0000 UTC m=+4671.879107094" observedRunningTime="2026-02-18 07:05:36.725909598 +0000 UTC m=+4673.373868742" watchObservedRunningTime="2026-02-18 07:05:36.736910666 +0000 UTC 
m=+4673.384869800" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.338775 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w6kpv"] Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.340726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.353721 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6kpv"] Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.442551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-utilities\") pod \"redhat-operators-w6kpv\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.443684 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4l9r\" (UniqueName: \"kubernetes.io/projected/0676f4fd-2ce3-494f-a118-d49f8478a7bf-kube-api-access-m4l9r\") pod \"redhat-operators-w6kpv\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.443859 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-catalog-content\") pod \"redhat-operators-w6kpv\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.545529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-utilities\") pod \"redhat-operators-w6kpv\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.545598 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4l9r\" (UniqueName: \"kubernetes.io/projected/0676f4fd-2ce3-494f-a118-d49f8478a7bf-kube-api-access-m4l9r\") pod \"redhat-operators-w6kpv\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.545637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-catalog-content\") pod \"redhat-operators-w6kpv\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.546393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-utilities\") pod \"redhat-operators-w6kpv\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.546448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-catalog-content\") pod \"redhat-operators-w6kpv\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.578456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4l9r\" (UniqueName: 
\"kubernetes.io/projected/0676f4fd-2ce3-494f-a118-d49f8478a7bf-kube-api-access-m4l9r\") pod \"redhat-operators-w6kpv\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:37 crc kubenswrapper[4707]: I0218 07:05:37.673752 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:38 crc kubenswrapper[4707]: W0218 07:05:38.177688 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0676f4fd_2ce3_494f_a118_d49f8478a7bf.slice/crio-8910bb144a1b971cbabaa3c4ddee30ede53f40c6e031846ad9b9da94910b6687 WatchSource:0}: Error finding container 8910bb144a1b971cbabaa3c4ddee30ede53f40c6e031846ad9b9da94910b6687: Status 404 returned error can't find the container with id 8910bb144a1b971cbabaa3c4ddee30ede53f40c6e031846ad9b9da94910b6687 Feb 18 07:05:38 crc kubenswrapper[4707]: I0218 07:05:38.183431 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6kpv"] Feb 18 07:05:38 crc kubenswrapper[4707]: I0218 07:05:38.740757 4707 generic.go:334] "Generic (PLEG): container finished" podID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerID="ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229" exitCode=0 Feb 18 07:05:38 crc kubenswrapper[4707]: I0218 07:05:38.740874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6kpv" event={"ID":"0676f4fd-2ce3-494f-a118-d49f8478a7bf","Type":"ContainerDied","Data":"ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229"} Feb 18 07:05:38 crc kubenswrapper[4707]: I0218 07:05:38.741110 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6kpv" 
event={"ID":"0676f4fd-2ce3-494f-a118-d49f8478a7bf","Type":"ContainerStarted","Data":"8910bb144a1b971cbabaa3c4ddee30ede53f40c6e031846ad9b9da94910b6687"} Feb 18 07:05:42 crc kubenswrapper[4707]: I0218 07:05:42.794047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6kpv" event={"ID":"0676f4fd-2ce3-494f-a118-d49f8478a7bf","Type":"ContainerStarted","Data":"fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457"} Feb 18 07:05:44 crc kubenswrapper[4707]: I0218 07:05:44.377894 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-frwrf/crc-debug-ndfd8"] Feb 18 07:05:44 crc kubenswrapper[4707]: I0218 07:05:44.379889 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:05:44 crc kubenswrapper[4707]: I0218 07:05:44.490330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b82f35b-febc-4d12-96a0-2cee407bab52-host\") pod \"crc-debug-ndfd8\" (UID: \"3b82f35b-febc-4d12-96a0-2cee407bab52\") " pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:05:44 crc kubenswrapper[4707]: I0218 07:05:44.490379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpsf5\" (UniqueName: \"kubernetes.io/projected/3b82f35b-febc-4d12-96a0-2cee407bab52-kube-api-access-qpsf5\") pod \"crc-debug-ndfd8\" (UID: \"3b82f35b-febc-4d12-96a0-2cee407bab52\") " pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:05:44 crc kubenswrapper[4707]: I0218 07:05:44.592926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b82f35b-febc-4d12-96a0-2cee407bab52-host\") pod \"crc-debug-ndfd8\" (UID: \"3b82f35b-febc-4d12-96a0-2cee407bab52\") " pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:05:44 
crc kubenswrapper[4707]: I0218 07:05:44.592975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpsf5\" (UniqueName: \"kubernetes.io/projected/3b82f35b-febc-4d12-96a0-2cee407bab52-kube-api-access-qpsf5\") pod \"crc-debug-ndfd8\" (UID: \"3b82f35b-febc-4d12-96a0-2cee407bab52\") " pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:05:44 crc kubenswrapper[4707]: I0218 07:05:44.593353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b82f35b-febc-4d12-96a0-2cee407bab52-host\") pod \"crc-debug-ndfd8\" (UID: \"3b82f35b-febc-4d12-96a0-2cee407bab52\") " pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:05:44 crc kubenswrapper[4707]: I0218 07:05:44.611883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpsf5\" (UniqueName: \"kubernetes.io/projected/3b82f35b-febc-4d12-96a0-2cee407bab52-kube-api-access-qpsf5\") pod \"crc-debug-ndfd8\" (UID: \"3b82f35b-febc-4d12-96a0-2cee407bab52\") " pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:05:44 crc kubenswrapper[4707]: I0218 07:05:44.697880 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:05:44 crc kubenswrapper[4707]: I0218 07:05:44.819569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/crc-debug-ndfd8" event={"ID":"3b82f35b-febc-4d12-96a0-2cee407bab52","Type":"ContainerStarted","Data":"5af5e845e3fcf9f806d4e5445e07db6c5030b90e09d0e7cac76847c95dfc710c"} Feb 18 07:05:46 crc kubenswrapper[4707]: I0218 07:05:46.836770 4707 generic.go:334] "Generic (PLEG): container finished" podID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerID="fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457" exitCode=0 Feb 18 07:05:46 crc kubenswrapper[4707]: I0218 07:05:46.836860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6kpv" event={"ID":"0676f4fd-2ce3-494f-a118-d49f8478a7bf","Type":"ContainerDied","Data":"fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457"} Feb 18 07:05:47 crc kubenswrapper[4707]: I0218 07:05:47.849859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6kpv" event={"ID":"0676f4fd-2ce3-494f-a118-d49f8478a7bf","Type":"ContainerStarted","Data":"7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745"} Feb 18 07:05:47 crc kubenswrapper[4707]: I0218 07:05:47.878247 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w6kpv" podStartSLOduration=2.215859899 podStartE2EDuration="10.878225079s" podCreationTimestamp="2026-02-18 07:05:37 +0000 UTC" firstStartedPulling="2026-02-18 07:05:38.74610015 +0000 UTC m=+4675.394059284" lastFinishedPulling="2026-02-18 07:05:47.40846533 +0000 UTC m=+4684.056424464" observedRunningTime="2026-02-18 07:05:47.866559294 +0000 UTC m=+4684.514518428" watchObservedRunningTime="2026-02-18 07:05:47.878225079 +0000 UTC m=+4684.526184223" Feb 18 07:05:51 crc kubenswrapper[4707]: I0218 07:05:51.382841 4707 
patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:05:51 crc kubenswrapper[4707]: I0218 07:05:51.383456 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:05:57 crc kubenswrapper[4707]: I0218 07:05:57.674686 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:57 crc kubenswrapper[4707]: I0218 07:05:57.675501 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:57 crc kubenswrapper[4707]: I0218 07:05:57.834763 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:57 crc kubenswrapper[4707]: I0218 07:05:57.954343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/crc-debug-ndfd8" event={"ID":"3b82f35b-febc-4d12-96a0-2cee407bab52","Type":"ContainerStarted","Data":"527f296409fdaa7ef7791b434fa4141714f3b03abf5f0d1a993397a3c04ff467"} Feb 18 07:05:57 crc kubenswrapper[4707]: I0218 07:05:57.971988 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-frwrf/crc-debug-ndfd8" podStartSLOduration=1.622425413 podStartE2EDuration="13.971971692s" podCreationTimestamp="2026-02-18 07:05:44 +0000 UTC" firstStartedPulling="2026-02-18 07:05:44.729452266 +0000 UTC m=+4681.377411400" lastFinishedPulling="2026-02-18 07:05:57.078998545 
+0000 UTC m=+4693.726957679" observedRunningTime="2026-02-18 07:05:57.967077629 +0000 UTC m=+4694.615036763" watchObservedRunningTime="2026-02-18 07:05:57.971971692 +0000 UTC m=+4694.619930826" Feb 18 07:05:58 crc kubenswrapper[4707]: I0218 07:05:58.028846 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:05:58 crc kubenswrapper[4707]: I0218 07:05:58.090160 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6kpv"] Feb 18 07:05:59 crc kubenswrapper[4707]: I0218 07:05:59.981259 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w6kpv" podUID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerName="registry-server" containerID="cri-o://7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745" gracePeriod=2 Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.484893 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.656235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-catalog-content\") pod \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.656386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-utilities\") pod \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.656442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4l9r\" (UniqueName: \"kubernetes.io/projected/0676f4fd-2ce3-494f-a118-d49f8478a7bf-kube-api-access-m4l9r\") pod \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\" (UID: \"0676f4fd-2ce3-494f-a118-d49f8478a7bf\") " Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.657477 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-utilities" (OuterVolumeSpecName: "utilities") pod "0676f4fd-2ce3-494f-a118-d49f8478a7bf" (UID: "0676f4fd-2ce3-494f-a118-d49f8478a7bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.663387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0676f4fd-2ce3-494f-a118-d49f8478a7bf-kube-api-access-m4l9r" (OuterVolumeSpecName: "kube-api-access-m4l9r") pod "0676f4fd-2ce3-494f-a118-d49f8478a7bf" (UID: "0676f4fd-2ce3-494f-a118-d49f8478a7bf"). InnerVolumeSpecName "kube-api-access-m4l9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.758507 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.758550 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4l9r\" (UniqueName: \"kubernetes.io/projected/0676f4fd-2ce3-494f-a118-d49f8478a7bf-kube-api-access-m4l9r\") on node \"crc\" DevicePath \"\"" Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.882391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0676f4fd-2ce3-494f-a118-d49f8478a7bf" (UID: "0676f4fd-2ce3-494f-a118-d49f8478a7bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.962942 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676f4fd-2ce3-494f-a118-d49f8478a7bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.995396 4707 generic.go:334] "Generic (PLEG): container finished" podID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerID="7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745" exitCode=0 Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.995433 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6kpv" event={"ID":"0676f4fd-2ce3-494f-a118-d49f8478a7bf","Type":"ContainerDied","Data":"7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745"} Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.996568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-w6kpv" event={"ID":"0676f4fd-2ce3-494f-a118-d49f8478a7bf","Type":"ContainerDied","Data":"8910bb144a1b971cbabaa3c4ddee30ede53f40c6e031846ad9b9da94910b6687"} Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.995539 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w6kpv" Feb 18 07:06:00 crc kubenswrapper[4707]: I0218 07:06:00.996668 4707 scope.go:117] "RemoveContainer" containerID="7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745" Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.030515 4707 scope.go:117] "RemoveContainer" containerID="fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457" Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.066090 4707 scope.go:117] "RemoveContainer" containerID="ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229" Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.066230 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6kpv"] Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.075570 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w6kpv"] Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.116786 4707 scope.go:117] "RemoveContainer" containerID="7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745" Feb 18 07:06:01 crc kubenswrapper[4707]: E0218 07:06:01.117235 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745\": container with ID starting with 7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745 not found: ID does not exist" containerID="7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745" Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.117338 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745"} err="failed to get container status \"7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745\": rpc error: code = NotFound desc = could not find container \"7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745\": container with ID starting with 7521b8add2e095193cb577f0bc7c740b5882014a479e36d6566799b79408c745 not found: ID does not exist" Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.117460 4707 scope.go:117] "RemoveContainer" containerID="fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457" Feb 18 07:06:01 crc kubenswrapper[4707]: E0218 07:06:01.117723 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457\": container with ID starting with fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457 not found: ID does not exist" containerID="fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457" Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.117816 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457"} err="failed to get container status \"fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457\": rpc error: code = NotFound desc = could not find container \"fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457\": container with ID starting with fb25c9cfb5d39e0587a8146420b8350cb036512b1883f193b83863f373036457 not found: ID does not exist" Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.117913 4707 scope.go:117] "RemoveContainer" containerID="ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229" Feb 18 07:06:01 crc kubenswrapper[4707]: E0218 
07:06:01.119285 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229\": container with ID starting with ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229 not found: ID does not exist" containerID="ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229" Feb 18 07:06:01 crc kubenswrapper[4707]: I0218 07:06:01.119376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229"} err="failed to get container status \"ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229\": rpc error: code = NotFound desc = could not find container \"ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229\": container with ID starting with ad9610471da75a4f1fbe25a77f5b0bd3b6008e972887678e6bc3648ae9f81229 not found: ID does not exist" Feb 18 07:06:02 crc kubenswrapper[4707]: I0218 07:06:02.063880 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" path="/var/lib/kubelet/pods/0676f4fd-2ce3-494f-a118-d49f8478a7bf/volumes" Feb 18 07:06:21 crc kubenswrapper[4707]: I0218 07:06:21.382809 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:06:21 crc kubenswrapper[4707]: I0218 07:06:21.383388 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 18 07:06:48 crc kubenswrapper[4707]: I0218 07:06:48.492367 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b82f35b-febc-4d12-96a0-2cee407bab52" containerID="527f296409fdaa7ef7791b434fa4141714f3b03abf5f0d1a993397a3c04ff467" exitCode=0 Feb 18 07:06:48 crc kubenswrapper[4707]: I0218 07:06:48.492532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/crc-debug-ndfd8" event={"ID":"3b82f35b-febc-4d12-96a0-2cee407bab52","Type":"ContainerDied","Data":"527f296409fdaa7ef7791b434fa4141714f3b03abf5f0d1a993397a3c04ff467"} Feb 18 07:06:49 crc kubenswrapper[4707]: I0218 07:06:49.614840 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:06:49 crc kubenswrapper[4707]: I0218 07:06:49.645781 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-frwrf/crc-debug-ndfd8"] Feb 18 07:06:49 crc kubenswrapper[4707]: I0218 07:06:49.659901 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-frwrf/crc-debug-ndfd8"] Feb 18 07:06:49 crc kubenswrapper[4707]: I0218 07:06:49.815665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpsf5\" (UniqueName: \"kubernetes.io/projected/3b82f35b-febc-4d12-96a0-2cee407bab52-kube-api-access-qpsf5\") pod \"3b82f35b-febc-4d12-96a0-2cee407bab52\" (UID: \"3b82f35b-febc-4d12-96a0-2cee407bab52\") " Feb 18 07:06:49 crc kubenswrapper[4707]: I0218 07:06:49.815724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b82f35b-febc-4d12-96a0-2cee407bab52-host\") pod \"3b82f35b-febc-4d12-96a0-2cee407bab52\" (UID: \"3b82f35b-febc-4d12-96a0-2cee407bab52\") " Feb 18 07:06:49 crc kubenswrapper[4707]: I0218 07:06:49.815908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/3b82f35b-febc-4d12-96a0-2cee407bab52-host" (OuterVolumeSpecName: "host") pod "3b82f35b-febc-4d12-96a0-2cee407bab52" (UID: "3b82f35b-febc-4d12-96a0-2cee407bab52"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 07:06:49 crc kubenswrapper[4707]: I0218 07:06:49.816445 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b82f35b-febc-4d12-96a0-2cee407bab52-host\") on node \"crc\" DevicePath \"\"" Feb 18 07:06:49 crc kubenswrapper[4707]: I0218 07:06:49.824121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b82f35b-febc-4d12-96a0-2cee407bab52-kube-api-access-qpsf5" (OuterVolumeSpecName: "kube-api-access-qpsf5") pod "3b82f35b-febc-4d12-96a0-2cee407bab52" (UID: "3b82f35b-febc-4d12-96a0-2cee407bab52"). InnerVolumeSpecName "kube-api-access-qpsf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:06:49 crc kubenswrapper[4707]: I0218 07:06:49.918116 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpsf5\" (UniqueName: \"kubernetes.io/projected/3b82f35b-febc-4d12-96a0-2cee407bab52-kube-api-access-qpsf5\") on node \"crc\" DevicePath \"\"" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.069757 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b82f35b-febc-4d12-96a0-2cee407bab52" path="/var/lib/kubelet/pods/3b82f35b-febc-4d12-96a0-2cee407bab52/volumes" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.509516 4707 scope.go:117] "RemoveContainer" containerID="527f296409fdaa7ef7791b434fa4141714f3b03abf5f0d1a993397a3c04ff467" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.509681 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-ndfd8" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.792750 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-frwrf/crc-debug-5bvvq"] Feb 18 07:06:50 crc kubenswrapper[4707]: E0218 07:06:50.793579 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerName="extract-utilities" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.793599 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerName="extract-utilities" Feb 18 07:06:50 crc kubenswrapper[4707]: E0218 07:06:50.793615 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b82f35b-febc-4d12-96a0-2cee407bab52" containerName="container-00" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.793624 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b82f35b-febc-4d12-96a0-2cee407bab52" containerName="container-00" Feb 18 07:06:50 crc kubenswrapper[4707]: E0218 07:06:50.793641 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerName="registry-server" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.793650 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerName="registry-server" Feb 18 07:06:50 crc kubenswrapper[4707]: E0218 07:06:50.793674 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerName="extract-content" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.793682 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerName="extract-content" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.793926 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b82f35b-febc-4d12-96a0-2cee407bab52" 
containerName="container-00" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.793956 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0676f4fd-2ce3-494f-a118-d49f8478a7bf" containerName="registry-server" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.794641 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.834993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrs4r\" (UniqueName: \"kubernetes.io/projected/9a5e127d-52bb-4a33-b163-f337511595e6-kube-api-access-zrs4r\") pod \"crc-debug-5bvvq\" (UID: \"9a5e127d-52bb-4a33-b163-f337511595e6\") " pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.835315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a5e127d-52bb-4a33-b163-f337511595e6-host\") pod \"crc-debug-5bvvq\" (UID: \"9a5e127d-52bb-4a33-b163-f337511595e6\") " pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.938630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrs4r\" (UniqueName: \"kubernetes.io/projected/9a5e127d-52bb-4a33-b163-f337511595e6-kube-api-access-zrs4r\") pod \"crc-debug-5bvvq\" (UID: \"9a5e127d-52bb-4a33-b163-f337511595e6\") " pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.938787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a5e127d-52bb-4a33-b163-f337511595e6-host\") pod \"crc-debug-5bvvq\" (UID: \"9a5e127d-52bb-4a33-b163-f337511595e6\") " pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:50 crc 
kubenswrapper[4707]: I0218 07:06:50.939037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a5e127d-52bb-4a33-b163-f337511595e6-host\") pod \"crc-debug-5bvvq\" (UID: \"9a5e127d-52bb-4a33-b163-f337511595e6\") " pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:50 crc kubenswrapper[4707]: I0218 07:06:50.956430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrs4r\" (UniqueName: \"kubernetes.io/projected/9a5e127d-52bb-4a33-b163-f337511595e6-kube-api-access-zrs4r\") pod \"crc-debug-5bvvq\" (UID: \"9a5e127d-52bb-4a33-b163-f337511595e6\") " pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.109004 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.382697 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.383221 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.383364 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.384475 4707 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75729da4be911af2deb616fb3bb9270ac6b0fe680ac28da2d1b09117643f8e74"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.384634 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://75729da4be911af2deb616fb3bb9270ac6b0fe680ac28da2d1b09117643f8e74" gracePeriod=600 Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.519261 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/crc-debug-5bvvq" event={"ID":"9a5e127d-52bb-4a33-b163-f337511595e6","Type":"ContainerStarted","Data":"876fc04fe13e167ec6453b0bab5515445b0d7f2e322731088c4e0fefec425934"} Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.519309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/crc-debug-5bvvq" event={"ID":"9a5e127d-52bb-4a33-b163-f337511595e6","Type":"ContainerStarted","Data":"796ec81eeb2e6ece65d076a7dd1caf696aa8d3a4684f7b4a2bb59138f6fab5b2"} Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.522260 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="75729da4be911af2deb616fb3bb9270ac6b0fe680ac28da2d1b09117643f8e74" exitCode=0 Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.522298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"75729da4be911af2deb616fb3bb9270ac6b0fe680ac28da2d1b09117643f8e74"} Feb 18 07:06:51 crc kubenswrapper[4707]: I0218 07:06:51.522324 4707 
scope.go:117] "RemoveContainer" containerID="6964f06dbadc62d0cc95d0ad88074be4cbe35a9a0190139337ace2a94f99f74e" Feb 18 07:06:52 crc kubenswrapper[4707]: I0218 07:06:52.533225 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8"} Feb 18 07:06:52 crc kubenswrapper[4707]: I0218 07:06:52.534768 4707 generic.go:334] "Generic (PLEG): container finished" podID="9a5e127d-52bb-4a33-b163-f337511595e6" containerID="876fc04fe13e167ec6453b0bab5515445b0d7f2e322731088c4e0fefec425934" exitCode=0 Feb 18 07:06:52 crc kubenswrapper[4707]: I0218 07:06:52.534821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/crc-debug-5bvvq" event={"ID":"9a5e127d-52bb-4a33-b163-f337511595e6","Type":"ContainerDied","Data":"876fc04fe13e167ec6453b0bab5515445b0d7f2e322731088c4e0fefec425934"} Feb 18 07:06:53 crc kubenswrapper[4707]: I0218 07:06:53.671779 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:53 crc kubenswrapper[4707]: I0218 07:06:53.698723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrs4r\" (UniqueName: \"kubernetes.io/projected/9a5e127d-52bb-4a33-b163-f337511595e6-kube-api-access-zrs4r\") pod \"9a5e127d-52bb-4a33-b163-f337511595e6\" (UID: \"9a5e127d-52bb-4a33-b163-f337511595e6\") " Feb 18 07:06:53 crc kubenswrapper[4707]: I0218 07:06:53.698832 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a5e127d-52bb-4a33-b163-f337511595e6-host\") pod \"9a5e127d-52bb-4a33-b163-f337511595e6\" (UID: \"9a5e127d-52bb-4a33-b163-f337511595e6\") " Feb 18 07:06:53 crc kubenswrapper[4707]: I0218 07:06:53.699536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a5e127d-52bb-4a33-b163-f337511595e6-host" (OuterVolumeSpecName: "host") pod "9a5e127d-52bb-4a33-b163-f337511595e6" (UID: "9a5e127d-52bb-4a33-b163-f337511595e6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 07:06:53 crc kubenswrapper[4707]: I0218 07:06:53.713230 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5e127d-52bb-4a33-b163-f337511595e6-kube-api-access-zrs4r" (OuterVolumeSpecName: "kube-api-access-zrs4r") pod "9a5e127d-52bb-4a33-b163-f337511595e6" (UID: "9a5e127d-52bb-4a33-b163-f337511595e6"). InnerVolumeSpecName "kube-api-access-zrs4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:06:53 crc kubenswrapper[4707]: I0218 07:06:53.801777 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrs4r\" (UniqueName: \"kubernetes.io/projected/9a5e127d-52bb-4a33-b163-f337511595e6-kube-api-access-zrs4r\") on node \"crc\" DevicePath \"\"" Feb 18 07:06:53 crc kubenswrapper[4707]: I0218 07:06:53.801862 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a5e127d-52bb-4a33-b163-f337511595e6-host\") on node \"crc\" DevicePath \"\"" Feb 18 07:06:54 crc kubenswrapper[4707]: I0218 07:06:54.556341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/crc-debug-5bvvq" event={"ID":"9a5e127d-52bb-4a33-b163-f337511595e6","Type":"ContainerDied","Data":"796ec81eeb2e6ece65d076a7dd1caf696aa8d3a4684f7b4a2bb59138f6fab5b2"} Feb 18 07:06:54 crc kubenswrapper[4707]: I0218 07:06:54.556656 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="796ec81eeb2e6ece65d076a7dd1caf696aa8d3a4684f7b4a2bb59138f6fab5b2" Feb 18 07:06:54 crc kubenswrapper[4707]: I0218 07:06:54.556431 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-5bvvq" Feb 18 07:06:55 crc kubenswrapper[4707]: I0218 07:06:55.147712 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-frwrf/crc-debug-5bvvq"] Feb 18 07:06:55 crc kubenswrapper[4707]: I0218 07:06:55.156654 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-frwrf/crc-debug-5bvvq"] Feb 18 07:06:56 crc kubenswrapper[4707]: I0218 07:06:56.778247 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5e127d-52bb-4a33-b163-f337511595e6" path="/var/lib/kubelet/pods/9a5e127d-52bb-4a33-b163-f337511595e6/volumes" Feb 18 07:06:56 crc kubenswrapper[4707]: I0218 07:06:56.867903 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-frwrf/crc-debug-6sznd"] Feb 18 07:06:56 crc kubenswrapper[4707]: E0218 07:06:56.868370 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5e127d-52bb-4a33-b163-f337511595e6" containerName="container-00" Feb 18 07:06:56 crc kubenswrapper[4707]: I0218 07:06:56.868392 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5e127d-52bb-4a33-b163-f337511595e6" containerName="container-00" Feb 18 07:06:56 crc kubenswrapper[4707]: I0218 07:06:56.868597 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5e127d-52bb-4a33-b163-f337511595e6" containerName="container-00" Feb 18 07:06:56 crc kubenswrapper[4707]: I0218 07:06:56.869229 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.054272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-host\") pod \"crc-debug-6sznd\" (UID: \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\") " pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.054677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5k79\" (UniqueName: \"kubernetes.io/projected/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-kube-api-access-m5k79\") pod \"crc-debug-6sznd\" (UID: \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\") " pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.157387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-host\") pod \"crc-debug-6sznd\" (UID: \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\") " pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.157464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5k79\" (UniqueName: \"kubernetes.io/projected/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-kube-api-access-m5k79\") pod \"crc-debug-6sznd\" (UID: \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\") " pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.157588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-host\") pod \"crc-debug-6sznd\" (UID: \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\") " pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:06:57 crc 
kubenswrapper[4707]: I0218 07:06:57.175994 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5k79\" (UniqueName: \"kubernetes.io/projected/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-kube-api-access-m5k79\") pod \"crc-debug-6sznd\" (UID: \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\") " pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.186771 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.797962 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786" containerID="41feea00f9abb891c47d61d84e50b5917c2b09b97acd49b8b35e9cb340ea371a" exitCode=0 Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.798069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/crc-debug-6sznd" event={"ID":"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786","Type":"ContainerDied","Data":"41feea00f9abb891c47d61d84e50b5917c2b09b97acd49b8b35e9cb340ea371a"} Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.798280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/crc-debug-6sznd" event={"ID":"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786","Type":"ContainerStarted","Data":"a085a87b4516a35a0ff9c13f80ed01227c421365820a5a43e6fca956aad2c7e1"} Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.838447 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-frwrf/crc-debug-6sznd"] Feb 18 07:06:57 crc kubenswrapper[4707]: I0218 07:06:57.846935 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-frwrf/crc-debug-6sznd"] Feb 18 07:06:58 crc kubenswrapper[4707]: I0218 07:06:58.927246 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:06:59 crc kubenswrapper[4707]: I0218 07:06:59.090911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-host\") pod \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\" (UID: \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\") " Feb 18 07:06:59 crc kubenswrapper[4707]: I0218 07:06:59.091038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-host" (OuterVolumeSpecName: "host") pod "4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786" (UID: "4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 07:06:59 crc kubenswrapper[4707]: I0218 07:06:59.091303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5k79\" (UniqueName: \"kubernetes.io/projected/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-kube-api-access-m5k79\") pod \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\" (UID: \"4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786\") " Feb 18 07:06:59 crc kubenswrapper[4707]: I0218 07:06:59.091846 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-host\") on node \"crc\" DevicePath \"\"" Feb 18 07:06:59 crc kubenswrapper[4707]: I0218 07:06:59.096903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-kube-api-access-m5k79" (OuterVolumeSpecName: "kube-api-access-m5k79") pod "4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786" (UID: "4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786"). InnerVolumeSpecName "kube-api-access-m5k79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:06:59 crc kubenswrapper[4707]: I0218 07:06:59.193865 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5k79\" (UniqueName: \"kubernetes.io/projected/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786-kube-api-access-m5k79\") on node \"crc\" DevicePath \"\"" Feb 18 07:06:59 crc kubenswrapper[4707]: I0218 07:06:59.817488 4707 scope.go:117] "RemoveContainer" containerID="41feea00f9abb891c47d61d84e50b5917c2b09b97acd49b8b35e9cb340ea371a" Feb 18 07:06:59 crc kubenswrapper[4707]: I0218 07:06:59.817731 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-frwrf/crc-debug-6sznd" Feb 18 07:07:00 crc kubenswrapper[4707]: I0218 07:07:00.064905 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786" path="/var/lib/kubelet/pods/4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786/volumes" Feb 18 07:07:16 crc kubenswrapper[4707]: I0218 07:07:16.013000 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-fc8b85554-bcs7j_13bd62ec-d5ea-4ad3-8020-0cc244072675/barbican-api/0.log" Feb 18 07:07:16 crc kubenswrapper[4707]: I0218 07:07:16.174645 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-fc8b85554-bcs7j_13bd62ec-d5ea-4ad3-8020-0cc244072675/barbican-api-log/0.log" Feb 18 07:07:16 crc kubenswrapper[4707]: I0218 07:07:16.252201 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-747894f44d-cbjjh_b16501ed-460b-4a5c-8d59-9acddd5e1011/barbican-keystone-listener/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.050858 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-667ddf5c59-6bpbq_06b50daa-d6b3-4865-b224-516392956313/barbican-worker/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.082926 4707 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_barbican-keystone-listener-747894f44d-cbjjh_b16501ed-460b-4a5c-8d59-9acddd5e1011/barbican-keystone-listener-log/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.110154 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-667ddf5c59-6bpbq_06b50daa-d6b3-4865-b224-516392956313/barbican-worker-log/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.302429 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-24v68_3c7dd778-4759-4515-bbf0-bbc5123e822f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.352614 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b/ceilometer-central-agent/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.370441 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b/ceilometer-notification-agent/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.515892 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b/sg-core/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.559486 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b/proxy-httpd/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.703156 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_1b24c097-807d-43e6-aaa5-b9abfb48bff5/ceph/0.log" Feb 18 07:07:17 crc kubenswrapper[4707]: I0218 07:07:17.952532 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6f03e391-db4f-46dd-b206-94e9f6d65e68/cinder-api/0.log" Feb 18 07:07:18 crc kubenswrapper[4707]: I0218 07:07:18.089701 4707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6f03e391-db4f-46dd-b206-94e9f6d65e68/cinder-api-log/0.log" Feb 18 07:07:18 crc kubenswrapper[4707]: I0218 07:07:18.469463 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a75b087d-214f-4fed-a30c-d0d4f5607a08/probe/0.log" Feb 18 07:07:18 crc kubenswrapper[4707]: I0218 07:07:18.471201 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2e24e699-659c-4701-9459-133197b510d7/cinder-scheduler/0.log" Feb 18 07:07:18 crc kubenswrapper[4707]: I0218 07:07:18.720202 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2e24e699-659c-4701-9459-133197b510d7/probe/0.log" Feb 18 07:07:18 crc kubenswrapper[4707]: I0218 07:07:18.720775 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a75b087d-214f-4fed-a30c-d0d4f5607a08/cinder-backup/0.log" Feb 18 07:07:18 crc kubenswrapper[4707]: I0218 07:07:18.969118 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_62efca2e-66ee-443d-910e-eb9c22f0536f/probe/0.log" Feb 18 07:07:19 crc kubenswrapper[4707]: I0218 07:07:19.022867 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-557tm_18d9274e-1766-4a10-9522-568030d5db64/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:19 crc kubenswrapper[4707]: I0218 07:07:19.178870 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-grlvs_beb9134a-dfca-4e8d-be56-0e0980d32bc8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:19 crc kubenswrapper[4707]: I0218 07:07:19.418896 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d474c7589-z56p2_0341405c-1d6a-4750-b7c5-07ae9825d4b6/init/0.log" Feb 18 07:07:19 crc 
kubenswrapper[4707]: I0218 07:07:19.625245 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d474c7589-z56p2_0341405c-1d6a-4750-b7c5-07ae9825d4b6/init/0.log" Feb 18 07:07:19 crc kubenswrapper[4707]: I0218 07:07:19.817003 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d474c7589-z56p2_0341405c-1d6a-4750-b7c5-07ae9825d4b6/dnsmasq-dns/0.log" Feb 18 07:07:19 crc kubenswrapper[4707]: I0218 07:07:19.873987 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xtr45_d7bc2edd-9db2-40df-be54-0db1c1b462fa/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:20 crc kubenswrapper[4707]: I0218 07:07:20.061055 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b/glance-log/0.log" Feb 18 07:07:20 crc kubenswrapper[4707]: I0218 07:07:20.100334 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b/glance-httpd/0.log" Feb 18 07:07:20 crc kubenswrapper[4707]: I0218 07:07:20.287756 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_36b54f53-a798-4b8c-99ab-773ba732530b/glance-httpd/0.log" Feb 18 07:07:20 crc kubenswrapper[4707]: I0218 07:07:20.307485 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_36b54f53-a798-4b8c-99ab-773ba732530b/glance-log/0.log" Feb 18 07:07:20 crc kubenswrapper[4707]: I0218 07:07:20.742230 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-rvws6_89235767-8eea-43b0-9b2e-cf7fc766a260/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:20 crc kubenswrapper[4707]: I0218 07:07:20.830399 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-77db99878b-h8xzs_6aa9efa8-e6b5-4307-89b1-8a67547a35e9/horizon/0.log" Feb 18 07:07:20 crc kubenswrapper[4707]: I0218 07:07:20.883820 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_62efca2e-66ee-443d-910e-eb9c22f0536f/cinder-volume/0.log" Feb 18 07:07:20 crc kubenswrapper[4707]: I0218 07:07:20.964694 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-45gf9_e3939063-5ede-47de-8c02-a46756c148b5/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:21 crc kubenswrapper[4707]: I0218 07:07:21.191697 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29523301-k4v49_8fcf8387-c297-4bb6-acc4-810bb4fab9e5/keystone-cron/0.log" Feb 18 07:07:21 crc kubenswrapper[4707]: I0218 07:07:21.208834 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77db99878b-h8xzs_6aa9efa8-e6b5-4307-89b1-8a67547a35e9/horizon-log/0.log" Feb 18 07:07:21 crc kubenswrapper[4707]: I0218 07:07:21.445866 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e2c446d7-5c5f-40e6-831d-4c3e6c75d13d/kube-state-metrics/0.log" Feb 18 07:07:21 crc kubenswrapper[4707]: I0218 07:07:21.637718 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv_603500de-24c1-4ef6-a13a-24646a085b58/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:22 crc kubenswrapper[4707]: I0218 07:07:22.086577 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_04bdc22d-7e6e-428b-849a-45c041654404/probe/0.log" Feb 18 07:07:22 crc kubenswrapper[4707]: I0218 07:07:22.124650 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_79366b7f-24dd-4217-b2af-7350751ce6d3/manila-api/0.log" Feb 18 07:07:22 crc kubenswrapper[4707]: 
I0218 07:07:22.185865 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_04bdc22d-7e6e-428b-849a-45c041654404/manila-scheduler/0.log" Feb 18 07:07:22 crc kubenswrapper[4707]: I0218 07:07:22.435644 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b/probe/0.log" Feb 18 07:07:22 crc kubenswrapper[4707]: I0218 07:07:22.736394 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b/manila-share/0.log" Feb 18 07:07:22 crc kubenswrapper[4707]: I0218 07:07:22.794785 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_79366b7f-24dd-4217-b2af-7350751ce6d3/manila-api-log/0.log" Feb 18 07:07:23 crc kubenswrapper[4707]: I0218 07:07:23.391061 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh_cf9b64ea-e740-4b80-b899-5f856afdd9c7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:23 crc kubenswrapper[4707]: I0218 07:07:23.668398 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6998659dbc-vmh65_0cb3d300-55b7-4aea-b732-2ab9a36ace83/neutron-httpd/0.log" Feb 18 07:07:24 crc kubenswrapper[4707]: I0218 07:07:24.281066 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6998659dbc-vmh65_0cb3d300-55b7-4aea-b732-2ab9a36ace83/neutron-api/0.log" Feb 18 07:07:25 crc kubenswrapper[4707]: I0218 07:07:25.059268 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6cc5d5b844-m7q6c_d9158ecc-f6e5-4c3f-a7e8-9195a34648b3/keystone-api/0.log" Feb 18 07:07:25 crc kubenswrapper[4707]: I0218 07:07:25.235357 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e596f7ea-65d0-41e8-8469-bf3aace5ed9a/nova-cell0-conductor-conductor/0.log" Feb 
18 07:07:25 crc kubenswrapper[4707]: I0218 07:07:25.745534 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_95d9aeec-f182-49b1-9064-352e3bd2fe9b/nova-cell1-conductor-conductor/0.log" Feb 18 07:07:26 crc kubenswrapper[4707]: I0218 07:07:26.219633 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ff7a9ac0-e9b0-4497-ad55-18768ff36da1/nova-api-log/0.log" Feb 18 07:07:26 crc kubenswrapper[4707]: I0218 07:07:26.557152 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_891ed851-3533-43e4-a60b-791e4ebd0afa/nova-cell1-novncproxy-novncproxy/0.log" Feb 18 07:07:26 crc kubenswrapper[4707]: I0218 07:07:26.591388 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-dck9h_4f8eff2f-2ca1-4fe4-8138-333c62468b97/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:26 crc kubenswrapper[4707]: I0218 07:07:26.866952 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ff7a9ac0-e9b0-4497-ad55-18768ff36da1/nova-api-api/0.log" Feb 18 07:07:26 crc kubenswrapper[4707]: I0218 07:07:26.929146 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ca2ba934-fce0-4bc1-af4e-27d758f7aef6/nova-metadata-log/0.log" Feb 18 07:07:27 crc kubenswrapper[4707]: I0218 07:07:27.217969 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_28c5a172-7c7d-407a-b727-0f982f82680c/mysql-bootstrap/0.log" Feb 18 07:07:27 crc kubenswrapper[4707]: I0218 07:07:27.430493 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_28c5a172-7c7d-407a-b727-0f982f82680c/galera/0.log" Feb 18 07:07:27 crc kubenswrapper[4707]: I0218 07:07:27.443860 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_28c5a172-7c7d-407a-b727-0f982f82680c/mysql-bootstrap/0.log" Feb 18 07:07:27 crc kubenswrapper[4707]: I0218 07:07:27.617653 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_cda1514b-6a18-4c59-8d92-4168f4dc589f/nova-scheduler-scheduler/0.log" Feb 18 07:07:27 crc kubenswrapper[4707]: I0218 07:07:27.778063 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ee6297b-9af9-40fd-90e0-edcb0c08f6e8/mysql-bootstrap/0.log" Feb 18 07:07:28 crc kubenswrapper[4707]: I0218 07:07:28.681959 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ee6297b-9af9-40fd-90e0-edcb0c08f6e8/mysql-bootstrap/0.log" Feb 18 07:07:28 crc kubenswrapper[4707]: I0218 07:07:28.739091 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ee6297b-9af9-40fd-90e0-edcb0c08f6e8/galera/0.log" Feb 18 07:07:28 crc kubenswrapper[4707]: I0218 07:07:28.850231 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ca2ba934-fce0-4bc1-af4e-27d758f7aef6/nova-metadata-metadata/0.log" Feb 18 07:07:28 crc kubenswrapper[4707]: I0218 07:07:28.856603 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_73dac699-5199-47dc-b173-8df7813c1ad4/openstackclient/0.log" Feb 18 07:07:29 crc kubenswrapper[4707]: I0218 07:07:29.015609 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-297xq_ea5baf83-32e6-41ec-b14a-d32b3f848be6/ovn-controller/0.log" Feb 18 07:07:29 crc kubenswrapper[4707]: I0218 07:07:29.077604 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-m2jp7_866d1055-f899-4a65-a353-366bf3a303bf/openstack-network-exporter/0.log" Feb 18 07:07:29 crc kubenswrapper[4707]: I0218 07:07:29.247830 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-f95ql_45dd27c5-0315-416d-99cc-197009aa5a8f/ovsdb-server-init/0.log" Feb 18 07:07:29 crc kubenswrapper[4707]: I0218 07:07:29.462826 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f95ql_45dd27c5-0315-416d-99cc-197009aa5a8f/ovsdb-server-init/0.log" Feb 18 07:07:29 crc kubenswrapper[4707]: I0218 07:07:29.519595 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f95ql_45dd27c5-0315-416d-99cc-197009aa5a8f/ovs-vswitchd/0.log" Feb 18 07:07:29 crc kubenswrapper[4707]: I0218 07:07:29.529604 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f95ql_45dd27c5-0315-416d-99cc-197009aa5a8f/ovsdb-server/0.log" Feb 18 07:07:29 crc kubenswrapper[4707]: I0218 07:07:29.683971 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xvlwh_09e0f47e-9057-4b18-ba9a-41b34b4fe425/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:29 crc kubenswrapper[4707]: I0218 07:07:29.787239 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_250e525d-abb2-4374-89d4-3b16602fc351/openstack-network-exporter/0.log" Feb 18 07:07:29 crc kubenswrapper[4707]: I0218 07:07:29.832154 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_250e525d-abb2-4374-89d4-3b16602fc351/ovn-northd/0.log" Feb 18 07:07:30 crc kubenswrapper[4707]: I0218 07:07:30.005870 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_27a6797f-d647-4727-ac53-df0b6d7495ca/openstack-network-exporter/0.log" Feb 18 07:07:30 crc kubenswrapper[4707]: I0218 07:07:30.063104 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_27a6797f-d647-4727-ac53-df0b6d7495ca/ovsdbserver-nb/0.log" Feb 18 07:07:30 crc kubenswrapper[4707]: I0218 07:07:30.292813 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c2df84df-113d-42d3-b7e7-ee6d01888dd9/ovsdbserver-sb/0.log" Feb 18 07:07:30 crc kubenswrapper[4707]: I0218 07:07:30.374776 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c2df84df-113d-42d3-b7e7-ee6d01888dd9/openstack-network-exporter/0.log" Feb 18 07:07:30 crc kubenswrapper[4707]: I0218 07:07:30.672162 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d/setup-container/0.log" Feb 18 07:07:30 crc kubenswrapper[4707]: I0218 07:07:30.730462 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-986f6fbf8-z89c7_9d4c03f6-2a0f-460b-9b68-50838289b469/placement-api/0.log" Feb 18 07:07:30 crc kubenswrapper[4707]: I0218 07:07:30.930603 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d/rabbitmq/0.log" Feb 18 07:07:30 crc kubenswrapper[4707]: I0218 07:07:30.942839 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d/setup-container/0.log" Feb 18 07:07:31 crc kubenswrapper[4707]: I0218 07:07:31.009861 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-986f6fbf8-z89c7_9d4c03f6-2a0f-460b-9b68-50838289b469/placement-log/0.log" Feb 18 07:07:31 crc kubenswrapper[4707]: I0218 07:07:31.179198 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7b14ae66-3d41-476b-9ca7-2490e36de0aa/setup-container/0.log" Feb 18 07:07:31 crc kubenswrapper[4707]: I0218 07:07:31.358535 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7b14ae66-3d41-476b-9ca7-2490e36de0aa/rabbitmq/0.log" Feb 18 07:07:31 crc kubenswrapper[4707]: I0218 07:07:31.382322 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_7b14ae66-3d41-476b-9ca7-2490e36de0aa/setup-container/0.log" Feb 18 07:07:31 crc kubenswrapper[4707]: I0218 07:07:31.470604 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8_e81fc37d-6fb1-4a43-b632-cec42f602002/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:31 crc kubenswrapper[4707]: I0218 07:07:31.717528 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4hc8q_668c00e7-edea-47b0-a904-961fb756cb1d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:31 crc kubenswrapper[4707]: I0218 07:07:31.877831 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t_4ccb192f-200d-453b-8829-3cdaddb0987b/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:32 crc kubenswrapper[4707]: I0218 07:07:32.240446 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5rj9p_eda30b1a-96f0-425e-908d-4846ffe8c3bb/ssh-known-hosts-edpm-deployment/0.log" Feb 18 07:07:32 crc kubenswrapper[4707]: I0218 07:07:32.285512 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-w7kvz_72503a5f-0b97-4eee-b0d1-7f9621b6917c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:32 crc kubenswrapper[4707]: I0218 07:07:32.556783 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-878756b99-xx5vn_a6b4c749-b753-42b9-8bc7-fb25121f0ea8/proxy-server/0.log" Feb 18 07:07:32 crc kubenswrapper[4707]: I0218 07:07:32.726280 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-878756b99-xx5vn_a6b4c749-b753-42b9-8bc7-fb25121f0ea8/proxy-httpd/0.log" Feb 18 07:07:32 crc kubenswrapper[4707]: I0218 07:07:32.732054 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-rbttv_4ff550e2-53ae-4f38-98d1-e95da8f7bde6/swift-ring-rebalance/0.log" Feb 18 07:07:32 crc kubenswrapper[4707]: I0218 07:07:32.839065 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/account-auditor/0.log" Feb 18 07:07:32 crc kubenswrapper[4707]: I0218 07:07:32.937932 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/account-reaper/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.060733 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/account-replicator/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.151755 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/container-auditor/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.195312 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/account-server/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.221750 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/container-replicator/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.301150 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/container-server/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.377979 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/container-updater/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.469267 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-expirer/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.554836 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-auditor/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.555090 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-replicator/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.627002 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-server/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.715772 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-updater/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.795017 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/swift-recon-cron/0.log" Feb 18 07:07:33 crc kubenswrapper[4707]: I0218 07:07:33.841321 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/rsync/0.log" Feb 18 07:07:34 crc kubenswrapper[4707]: I0218 07:07:34.026451 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l_6afe228a-638b-41a3-ba74-556fbc740148/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:34 crc kubenswrapper[4707]: I0218 07:07:34.266091 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e369f41f-534e-48ee-bdcb-da26b742cfc3/tempest-tests-tempest-tests-runner/0.log" Feb 18 07:07:34 crc kubenswrapper[4707]: I0218 07:07:34.274962 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3c04cb22-866a-403f-9a07-1a12cfd909e2/test-operator-logs-container/0.log" Feb 18 07:07:34 crc kubenswrapper[4707]: I0218 07:07:34.500309 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl_6cfdc829-6a01-4b1b-b774-5b7a0ff96d68/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:07:42 crc kubenswrapper[4707]: I0218 07:07:42.235842 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4a3a1b52-c364-480e-a60b-8bc313f3002d/memcached/0.log" Feb 18 07:08:01 crc kubenswrapper[4707]: I0218 07:08:01.288539 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/util/0.log" Feb 18 07:08:01 crc kubenswrapper[4707]: I0218 07:08:01.458182 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/pull/0.log" Feb 18 07:08:01 crc kubenswrapper[4707]: I0218 07:08:01.459743 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/util/0.log" Feb 18 07:08:01 crc kubenswrapper[4707]: I0218 07:08:01.499708 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/pull/0.log" Feb 18 07:08:01 crc kubenswrapper[4707]: I0218 07:08:01.782512 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/pull/0.log" Feb 18 07:08:01 crc kubenswrapper[4707]: I0218 
07:08:01.786655 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/util/0.log" Feb 18 07:08:01 crc kubenswrapper[4707]: I0218 07:08:01.789258 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/extract/0.log" Feb 18 07:08:02 crc kubenswrapper[4707]: I0218 07:08:02.234743 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-6lrh7_bc6f5234-aab6-43ea-89e1-a3f785742a89/manager/0.log" Feb 18 07:08:02 crc kubenswrapper[4707]: I0218 07:08:02.589808 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-fnz67_8f61ada5-7374-4801-89b2-c95aec2e52ab/manager/0.log" Feb 18 07:08:02 crc kubenswrapper[4707]: I0218 07:08:02.660953 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-244nk_6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742/manager/0.log" Feb 18 07:08:02 crc kubenswrapper[4707]: I0218 07:08:02.878681 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-85hj5_1a236879-9c6a-4604-b5bc-024b7dfd5161/manager/0.log" Feb 18 07:08:03 crc kubenswrapper[4707]: I0218 07:08:03.457871 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-j5dft_93d80e73-44d0-4db8-8a43-ee2cc8b7e399/manager/0.log" Feb 18 07:08:03 crc kubenswrapper[4707]: I0218 07:08:03.511594 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-8gngt_8ed2f5cf-84b8-4a09-b76f-a60bcb055a04/manager/0.log" Feb 18 07:08:03 
crc kubenswrapper[4707]: I0218 07:08:03.862972 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-6f96w_274d7d14-4ef9-47b8-8a2e-07e7a2bb9850/manager/0.log" Feb 18 07:08:04 crc kubenswrapper[4707]: I0218 07:08:04.188663 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-c5s9h_dc8762c9-27f5-476e-840f-815aa3736e85/manager/0.log" Feb 18 07:08:04 crc kubenswrapper[4707]: I0218 07:08:04.440585 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-v6v5m_4c759f5c-da54-44e9-8dec-5f2622419af9/manager/0.log" Feb 18 07:08:04 crc kubenswrapper[4707]: I0218 07:08:04.575302 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-rnhbn_4fcd0bf8-cf6a-45c0-862b-5554daa34c21/manager/0.log" Feb 18 07:08:04 crc kubenswrapper[4707]: I0218 07:08:04.702760 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-2hv82_5442f037-ff83-40b8-9c3f-c73c227effde/manager/0.log" Feb 18 07:08:04 crc kubenswrapper[4707]: I0218 07:08:04.925744 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-twhwz_dc38a034-90cc-4976-93dd-ae54d298b574/manager/0.log" Feb 18 07:08:05 crc kubenswrapper[4707]: I0218 07:08:05.127299 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd_8078f629-a80e-4f59-b84a-33144cc5b0c6/manager/0.log" Feb 18 07:08:05 crc kubenswrapper[4707]: I0218 07:08:05.563204 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-init-766dc4fc6-q9dtp_b70a612f-7e0b-4187-82b0-404c913ce3d4/operator/0.log" Feb 18 07:08:05 crc kubenswrapper[4707]: I0218 07:08:05.760780 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nnvqj_ccede39b-e3bf-4e86-9b9e-bbdc1b13a349/registry-server/0.log" Feb 18 07:08:06 crc kubenswrapper[4707]: I0218 07:08:06.013592 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-9bb9z_1332f158-2c06-4a3d-9ca9-2dc667c471ba/manager/0.log" Feb 18 07:08:06 crc kubenswrapper[4707]: I0218 07:08:06.238724 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-48k7j_398bbd80-3377-4b8e-b9cd-bdb3a76167ca/manager/0.log" Feb 18 07:08:06 crc kubenswrapper[4707]: I0218 07:08:06.441533 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4qmcg_97e7c996-241f-4732-9e68-a371d114f664/operator/0.log" Feb 18 07:08:06 crc kubenswrapper[4707]: I0218 07:08:06.652148 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-tj2tj_78a96912-db1a-42b8-80aa-7800f28fb0c2/manager/0.log" Feb 18 07:08:06 crc kubenswrapper[4707]: I0218 07:08:06.954182 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-ln7bk_7f2692c0-280b-4449-ac2d-6a9da6eafebe/manager/0.log" Feb 18 07:08:07 crc kubenswrapper[4707]: I0218 07:08:07.469166 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-dccc9b448-chjxf_c8f9f5f4-3cdb-4b04-bc52-26acb4dda227/manager/0.log" Feb 18 07:08:07 crc kubenswrapper[4707]: I0218 07:08:07.638291 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-2dp4d_d4364aee-09c0-49d9-8f50-60e48ecb7d08/manager/0.log" Feb 18 07:08:07 crc kubenswrapper[4707]: I0218 07:08:07.771469 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-4d59f_890576c4-79c6-40dc-b786-0fb2055a1a3e/manager/0.log" Feb 18 07:08:07 crc kubenswrapper[4707]: I0218 07:08:07.791074 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-xbl8j_77cee8d8-c1d5-4743-a6c0-478b7c16e991/manager/0.log" Feb 18 07:08:12 crc kubenswrapper[4707]: I0218 07:08:12.706449 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-qj27r_67bdd3cc-ee7d-4e79-8568-75502788aa1d/manager/0.log" Feb 18 07:08:30 crc kubenswrapper[4707]: I0218 07:08:30.320482 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tnwns_e09b1e8f-752e-42dc-a638-cc7ac7179f83/control-plane-machine-set-operator/0.log" Feb 18 07:08:30 crc kubenswrapper[4707]: I0218 07:08:30.502166 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-g4hvr_b7a4eced-46b2-4002-964d-490b0ad2acd3/kube-rbac-proxy/0.log" Feb 18 07:08:30 crc kubenswrapper[4707]: I0218 07:08:30.515703 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-g4hvr_b7a4eced-46b2-4002-964d-490b0ad2acd3/machine-api-operator/0.log" Feb 18 07:08:44 crc kubenswrapper[4707]: I0218 07:08:44.062500 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fvczt_ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0/cert-manager-controller/0.log" Feb 18 07:08:44 crc kubenswrapper[4707]: I0218 07:08:44.221752 4707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-hr4ts_e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f/cert-manager-cainjector/0.log" Feb 18 07:08:44 crc kubenswrapper[4707]: I0218 07:08:44.233486 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-dgt7d_07e55359-ea37-4736-a571-315823908633/cert-manager-webhook/0.log" Feb 18 07:08:51 crc kubenswrapper[4707]: I0218 07:08:51.382040 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:08:51 crc kubenswrapper[4707]: I0218 07:08:51.382604 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:08:55 crc kubenswrapper[4707]: I0218 07:08:55.453927 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-8xc9q_fd51f090-835b-4d6f-9204-1564b2430039/nmstate-console-plugin/0.log" Feb 18 07:08:55 crc kubenswrapper[4707]: I0218 07:08:55.745594 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dfp8q_7c0f897b-dd2e-4356-ae9d-a85bae401266/nmstate-handler/0.log" Feb 18 07:08:55 crc kubenswrapper[4707]: I0218 07:08:55.778847 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qmfqj_07ea2b2b-697c-491e-89fe-707d7a2f6a32/kube-rbac-proxy/0.log" Feb 18 07:08:55 crc kubenswrapper[4707]: I0218 07:08:55.843777 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qmfqj_07ea2b2b-697c-491e-89fe-707d7a2f6a32/nmstate-metrics/0.log" Feb 18 07:08:56 crc kubenswrapper[4707]: I0218 07:08:56.024580 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-cn2n8_ff6bca46-e4fa-443f-82c7-7995a2b6499b/nmstate-webhook/0.log" Feb 18 07:08:56 crc kubenswrapper[4707]: I0218 07:08:56.033265 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-wjprs_69e29cb2-d836-4c53-81e6-1d387d6202b9/nmstate-operator/0.log" Feb 18 07:09:20 crc kubenswrapper[4707]: I0218 07:09:20.325312 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-4gf26_9027d820-8aca-4e3b-84f0-4b81be566548/kube-rbac-proxy/0.log" Feb 18 07:09:20 crc kubenswrapper[4707]: I0218 07:09:20.536410 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-4gf26_9027d820-8aca-4e3b-84f0-4b81be566548/controller/0.log" Feb 18 07:09:20 crc kubenswrapper[4707]: I0218 07:09:20.614816 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-frr-files/0.log" Feb 18 07:09:20 crc kubenswrapper[4707]: I0218 07:09:20.762192 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-frr-files/0.log" Feb 18 07:09:20 crc kubenswrapper[4707]: I0218 07:09:20.768195 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-metrics/0.log" Feb 18 07:09:20 crc kubenswrapper[4707]: I0218 07:09:20.771952 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-reloader/0.log" Feb 18 07:09:20 crc kubenswrapper[4707]: I0218 07:09:20.795904 4707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-reloader/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.010332 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-frr-files/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.015463 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-metrics/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.021613 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-reloader/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.038104 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-metrics/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.171492 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-reloader/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.179568 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-frr-files/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.225061 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-metrics/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.329526 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/controller/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.350671 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/frr-metrics/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.381751 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.381819 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.456012 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/kube-rbac-proxy/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.575696 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/kube-rbac-proxy-frr/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.586279 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/reloader/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.778836 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-kt4g9_a3bf8bf8-cafc-49e2-b284-33d016f8bb50/frr-k8s-webhook-server/0.log" Feb 18 07:09:21 crc kubenswrapper[4707]: I0218 07:09:21.961669 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7cd6fc9664-wtj2x_a755d713-37ff-463f-81d6-aa0bfc05c654/manager/0.log" Feb 18 
07:09:22 crc kubenswrapper[4707]: I0218 07:09:22.095453 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79944b854-l7jrs_44e90078-03e0-4691-ba27-cbd9c5ab9cbe/webhook-server/0.log" Feb 18 07:09:22 crc kubenswrapper[4707]: I0218 07:09:22.227781 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lh4s7_c3112677-66f0-45d3-9281-094cd5c11163/kube-rbac-proxy/0.log" Feb 18 07:09:23 crc kubenswrapper[4707]: I0218 07:09:23.056046 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lh4s7_c3112677-66f0-45d3-9281-094cd5c11163/speaker/0.log" Feb 18 07:09:23 crc kubenswrapper[4707]: I0218 07:09:23.197629 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/frr/0.log" Feb 18 07:09:36 crc kubenswrapper[4707]: I0218 07:09:36.263661 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/util/0.log" Feb 18 07:09:36 crc kubenswrapper[4707]: I0218 07:09:36.439296 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/util/0.log" Feb 18 07:09:36 crc kubenswrapper[4707]: I0218 07:09:36.454947 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/pull/0.log" Feb 18 07:09:36 crc kubenswrapper[4707]: I0218 07:09:36.521717 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/pull/0.log" Feb 18 07:09:37 crc kubenswrapper[4707]: I0218 07:09:37.315891 4707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/extract/0.log" Feb 18 07:09:37 crc kubenswrapper[4707]: I0218 07:09:37.322271 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/util/0.log" Feb 18 07:09:37 crc kubenswrapper[4707]: I0218 07:09:37.323904 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/pull/0.log" Feb 18 07:09:37 crc kubenswrapper[4707]: I0218 07:09:37.522271 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-utilities/0.log" Feb 18 07:09:37 crc kubenswrapper[4707]: I0218 07:09:37.693838 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-utilities/0.log" Feb 18 07:09:37 crc kubenswrapper[4707]: I0218 07:09:37.707744 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-content/0.log" Feb 18 07:09:37 crc kubenswrapper[4707]: I0218 07:09:37.745369 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-content/0.log" Feb 18 07:09:37 crc kubenswrapper[4707]: I0218 07:09:37.906474 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-content/0.log" Feb 18 07:09:37 crc kubenswrapper[4707]: I0218 07:09:37.919299 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-utilities/0.log" Feb 18 07:09:38 crc kubenswrapper[4707]: I0218 07:09:38.166142 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-utilities/0.log" Feb 18 07:09:38 crc kubenswrapper[4707]: I0218 07:09:38.421842 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-content/0.log" Feb 18 07:09:38 crc kubenswrapper[4707]: I0218 07:09:38.453161 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-utilities/0.log" Feb 18 07:09:38 crc kubenswrapper[4707]: I0218 07:09:38.453389 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-content/0.log" Feb 18 07:09:38 crc kubenswrapper[4707]: I0218 07:09:38.696988 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-content/0.log" Feb 18 07:09:38 crc kubenswrapper[4707]: I0218 07:09:38.736121 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/registry-server/0.log" Feb 18 07:09:38 crc kubenswrapper[4707]: I0218 07:09:38.737442 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-utilities/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.052540 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/util/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.136007 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/registry-server/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.222277 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/util/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.267724 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/pull/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.267983 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/pull/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.438763 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/util/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.480326 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/pull/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.496219 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/extract/0.log" Feb 18 07:09:39 crc 
kubenswrapper[4707]: I0218 07:09:39.624201 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z7xmc_ef5bee5f-c0c3-471e-88fb-43735b7c0b31/marketplace-operator/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.713105 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-utilities/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.900877 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-content/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.904647 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-content/0.log" Feb 18 07:09:39 crc kubenswrapper[4707]: I0218 07:09:39.920506 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-utilities/0.log" Feb 18 07:09:40 crc kubenswrapper[4707]: I0218 07:09:40.097058 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-content/0.log" Feb 18 07:09:40 crc kubenswrapper[4707]: I0218 07:09:40.107272 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-utilities/0.log" Feb 18 07:09:40 crc kubenswrapper[4707]: I0218 07:09:40.293769 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/registry-server/0.log" Feb 18 07:09:40 crc kubenswrapper[4707]: I0218 07:09:40.346475 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-utilities/0.log" Feb 18 07:09:40 crc kubenswrapper[4707]: I0218 07:09:40.544552 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-content/0.log" Feb 18 07:09:40 crc kubenswrapper[4707]: I0218 07:09:40.550056 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-utilities/0.log" Feb 18 07:09:40 crc kubenswrapper[4707]: I0218 07:09:40.554866 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-content/0.log" Feb 18 07:09:40 crc kubenswrapper[4707]: I0218 07:09:40.707758 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-utilities/0.log" Feb 18 07:09:40 crc kubenswrapper[4707]: I0218 07:09:40.741725 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-content/0.log" Feb 18 07:09:41 crc kubenswrapper[4707]: I0218 07:09:41.408103 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/registry-server/0.log" Feb 18 07:09:51 crc kubenswrapper[4707]: I0218 07:09:51.382583 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:09:51 crc kubenswrapper[4707]: I0218 07:09:51.383153 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:09:51 crc kubenswrapper[4707]: I0218 07:09:51.383195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 07:09:51 crc kubenswrapper[4707]: I0218 07:09:51.383860 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 07:09:51 crc kubenswrapper[4707]: I0218 07:09:51.383902 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" gracePeriod=600 Feb 18 07:09:51 crc kubenswrapper[4707]: E0218 07:09:51.508432 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:09:52 crc kubenswrapper[4707]: I0218 07:09:52.403694 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" 
containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" exitCode=0 Feb 18 07:09:52 crc kubenswrapper[4707]: I0218 07:09:52.403749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8"} Feb 18 07:09:52 crc kubenswrapper[4707]: I0218 07:09:52.404389 4707 scope.go:117] "RemoveContainer" containerID="75729da4be911af2deb616fb3bb9270ac6b0fe680ac28da2d1b09117643f8e74" Feb 18 07:09:52 crc kubenswrapper[4707]: I0218 07:09:52.405249 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:09:52 crc kubenswrapper[4707]: E0218 07:09:52.405666 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:10:06 crc kubenswrapper[4707]: I0218 07:10:06.053739 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:10:06 crc kubenswrapper[4707]: E0218 07:10:06.054638 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:10:18 crc kubenswrapper[4707]: I0218 
07:10:18.054366 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:10:18 crc kubenswrapper[4707]: E0218 07:10:18.055226 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.715636 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mvs6w"] Feb 18 07:10:21 crc kubenswrapper[4707]: E0218 07:10:21.716567 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786" containerName="container-00" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.716585 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786" containerName="container-00" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.716768 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6ceb6a-9ee2-46f0-a4f3-d9a639b81786" containerName="container-00" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.718149 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.737709 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mvs6w"] Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.830920 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-utilities\") pod \"community-operators-mvs6w\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.831051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-catalog-content\") pod \"community-operators-mvs6w\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.831137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dn2j\" (UniqueName: \"kubernetes.io/projected/906d4ac2-fe98-48ab-9a08-b9a5de5300af-kube-api-access-5dn2j\") pod \"community-operators-mvs6w\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.933437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-utilities\") pod \"community-operators-mvs6w\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.933894 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-catalog-content\") pod \"community-operators-mvs6w\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.933980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dn2j\" (UniqueName: \"kubernetes.io/projected/906d4ac2-fe98-48ab-9a08-b9a5de5300af-kube-api-access-5dn2j\") pod \"community-operators-mvs6w\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.934009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-utilities\") pod \"community-operators-mvs6w\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.934383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-catalog-content\") pod \"community-operators-mvs6w\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:21 crc kubenswrapper[4707]: I0218 07:10:21.953589 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dn2j\" (UniqueName: \"kubernetes.io/projected/906d4ac2-fe98-48ab-9a08-b9a5de5300af-kube-api-access-5dn2j\") pod \"community-operators-mvs6w\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:22 crc kubenswrapper[4707]: I0218 07:10:22.036280 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:22 crc kubenswrapper[4707]: I0218 07:10:22.567621 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mvs6w"] Feb 18 07:10:22 crc kubenswrapper[4707]: I0218 07:10:22.701943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvs6w" event={"ID":"906d4ac2-fe98-48ab-9a08-b9a5de5300af","Type":"ContainerStarted","Data":"0f10473182fb031da92e75aba2b0c5032818263012a21ae04147f48113db3f30"} Feb 18 07:10:23 crc kubenswrapper[4707]: I0218 07:10:23.724512 4707 generic.go:334] "Generic (PLEG): container finished" podID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerID="9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7" exitCode=0 Feb 18 07:10:23 crc kubenswrapper[4707]: I0218 07:10:23.724746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvs6w" event={"ID":"906d4ac2-fe98-48ab-9a08-b9a5de5300af","Type":"ContainerDied","Data":"9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7"} Feb 18 07:10:23 crc kubenswrapper[4707]: I0218 07:10:23.728457 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 07:10:24 crc kubenswrapper[4707]: I0218 07:10:24.734846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvs6w" event={"ID":"906d4ac2-fe98-48ab-9a08-b9a5de5300af","Type":"ContainerStarted","Data":"1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607"} Feb 18 07:10:25 crc kubenswrapper[4707]: I0218 07:10:25.743516 4707 generic.go:334] "Generic (PLEG): container finished" podID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerID="1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607" exitCode=0 Feb 18 07:10:25 crc kubenswrapper[4707]: I0218 07:10:25.743672 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-mvs6w" event={"ID":"906d4ac2-fe98-48ab-9a08-b9a5de5300af","Type":"ContainerDied","Data":"1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607"} Feb 18 07:10:26 crc kubenswrapper[4707]: I0218 07:10:26.755607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvs6w" event={"ID":"906d4ac2-fe98-48ab-9a08-b9a5de5300af","Type":"ContainerStarted","Data":"69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09"} Feb 18 07:10:31 crc kubenswrapper[4707]: I0218 07:10:31.053486 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:10:31 crc kubenswrapper[4707]: E0218 07:10:31.054393 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:10:32 crc kubenswrapper[4707]: I0218 07:10:32.038805 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:32 crc kubenswrapper[4707]: I0218 07:10:32.039191 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:32 crc kubenswrapper[4707]: I0218 07:10:32.092730 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:32 crc kubenswrapper[4707]: I0218 07:10:32.116159 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mvs6w" 
podStartSLOduration=8.720430464 podStartE2EDuration="11.116140134s" podCreationTimestamp="2026-02-18 07:10:21 +0000 UTC" firstStartedPulling="2026-02-18 07:10:23.727835728 +0000 UTC m=+4960.375794872" lastFinishedPulling="2026-02-18 07:10:26.123545398 +0000 UTC m=+4962.771504542" observedRunningTime="2026-02-18 07:10:26.780245423 +0000 UTC m=+4963.428204557" watchObservedRunningTime="2026-02-18 07:10:32.116140134 +0000 UTC m=+4968.764099268" Feb 18 07:10:32 crc kubenswrapper[4707]: I0218 07:10:32.850875 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:32 crc kubenswrapper[4707]: I0218 07:10:32.894353 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mvs6w"] Feb 18 07:10:34 crc kubenswrapper[4707]: I0218 07:10:34.822403 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mvs6w" podUID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerName="registry-server" containerID="cri-o://69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09" gracePeriod=2 Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.276663 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.306368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-catalog-content\") pod \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.306471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dn2j\" (UniqueName: \"kubernetes.io/projected/906d4ac2-fe98-48ab-9a08-b9a5de5300af-kube-api-access-5dn2j\") pod \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.306547 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-utilities\") pod \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\" (UID: \"906d4ac2-fe98-48ab-9a08-b9a5de5300af\") " Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.307760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-utilities" (OuterVolumeSpecName: "utilities") pod "906d4ac2-fe98-48ab-9a08-b9a5de5300af" (UID: "906d4ac2-fe98-48ab-9a08-b9a5de5300af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.318994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906d4ac2-fe98-48ab-9a08-b9a5de5300af-kube-api-access-5dn2j" (OuterVolumeSpecName: "kube-api-access-5dn2j") pod "906d4ac2-fe98-48ab-9a08-b9a5de5300af" (UID: "906d4ac2-fe98-48ab-9a08-b9a5de5300af"). InnerVolumeSpecName "kube-api-access-5dn2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.408194 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.408221 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dn2j\" (UniqueName: \"kubernetes.io/projected/906d4ac2-fe98-48ab-9a08-b9a5de5300af-kube-api-access-5dn2j\") on node \"crc\" DevicePath \"\"" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.604192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "906d4ac2-fe98-48ab-9a08-b9a5de5300af" (UID: "906d4ac2-fe98-48ab-9a08-b9a5de5300af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.611863 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906d4ac2-fe98-48ab-9a08-b9a5de5300af-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.835383 4707 generic.go:334] "Generic (PLEG): container finished" podID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerID="69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09" exitCode=0 Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.835449 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvs6w" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.835443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvs6w" event={"ID":"906d4ac2-fe98-48ab-9a08-b9a5de5300af","Type":"ContainerDied","Data":"69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09"} Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.835537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvs6w" event={"ID":"906d4ac2-fe98-48ab-9a08-b9a5de5300af","Type":"ContainerDied","Data":"0f10473182fb031da92e75aba2b0c5032818263012a21ae04147f48113db3f30"} Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.835566 4707 scope.go:117] "RemoveContainer" containerID="69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.857724 4707 scope.go:117] "RemoveContainer" containerID="1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.873568 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mvs6w"] Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.883385 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mvs6w"] Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.894839 4707 scope.go:117] "RemoveContainer" containerID="9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.927516 4707 scope.go:117] "RemoveContainer" containerID="69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09" Feb 18 07:10:35 crc kubenswrapper[4707]: E0218 07:10:35.928505 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09\": container with ID starting with 69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09 not found: ID does not exist" containerID="69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.928570 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09"} err="failed to get container status \"69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09\": rpc error: code = NotFound desc = could not find container \"69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09\": container with ID starting with 69222a6be9b5986d669bc3004f51c1171ba87eefddf00d7f31e3650a99b34f09 not found: ID does not exist" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.928604 4707 scope.go:117] "RemoveContainer" containerID="1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607" Feb 18 07:10:35 crc kubenswrapper[4707]: E0218 07:10:35.929943 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607\": container with ID starting with 1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607 not found: ID does not exist" containerID="1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.929977 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607"} err="failed to get container status \"1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607\": rpc error: code = NotFound desc = could not find container \"1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607\": container with ID 
starting with 1e1205caa3803022543207567f62b31cf80b2e0ef189c7c4125ad25e5cc0a607 not found: ID does not exist" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.930006 4707 scope.go:117] "RemoveContainer" containerID="9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7" Feb 18 07:10:35 crc kubenswrapper[4707]: E0218 07:10:35.930575 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7\": container with ID starting with 9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7 not found: ID does not exist" containerID="9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7" Feb 18 07:10:35 crc kubenswrapper[4707]: I0218 07:10:35.930628 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7"} err="failed to get container status \"9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7\": rpc error: code = NotFound desc = could not find container \"9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7\": container with ID starting with 9483f4377f751473da473613d8161b30e8b74f3701e9299c4c5f565cb7047dc7 not found: ID does not exist" Feb 18 07:10:36 crc kubenswrapper[4707]: I0218 07:10:36.063623 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" path="/var/lib/kubelet/pods/906d4ac2-fe98-48ab-9a08-b9a5de5300af/volumes" Feb 18 07:10:43 crc kubenswrapper[4707]: I0218 07:10:43.053241 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:10:43 crc kubenswrapper[4707]: E0218 07:10:43.054075 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:10:56 crc kubenswrapper[4707]: I0218 07:10:56.056721 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:10:56 crc kubenswrapper[4707]: E0218 07:10:56.058047 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:11:07 crc kubenswrapper[4707]: I0218 07:11:07.054308 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:11:07 crc kubenswrapper[4707]: E0218 07:11:07.055471 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:11:19 crc kubenswrapper[4707]: I0218 07:11:19.053427 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:11:19 crc kubenswrapper[4707]: E0218 07:11:19.054233 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:11:32 crc kubenswrapper[4707]: I0218 07:11:32.054336 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:11:32 crc kubenswrapper[4707]: E0218 07:11:32.055110 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:11:45 crc kubenswrapper[4707]: I0218 07:11:45.053556 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:11:45 crc kubenswrapper[4707]: E0218 07:11:45.055741 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:11:58 crc kubenswrapper[4707]: I0218 07:11:58.053025 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:11:58 crc kubenswrapper[4707]: E0218 07:11:58.053748 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:12:03 crc kubenswrapper[4707]: I0218 07:12:03.586993 4707 generic.go:334] "Generic (PLEG): container finished" podID="0409d47c-4b51-4b82-86e9-be8f5fc24024" containerID="36363cb52d45dc1edada9d23f42f1a2ceda1fc8d376433f9d011172a6d0342bd" exitCode=0 Feb 18 07:12:03 crc kubenswrapper[4707]: I0218 07:12:03.587074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-frwrf/must-gather-98km4" event={"ID":"0409d47c-4b51-4b82-86e9-be8f5fc24024","Type":"ContainerDied","Data":"36363cb52d45dc1edada9d23f42f1a2ceda1fc8d376433f9d011172a6d0342bd"} Feb 18 07:12:03 crc kubenswrapper[4707]: I0218 07:12:03.588349 4707 scope.go:117] "RemoveContainer" containerID="36363cb52d45dc1edada9d23f42f1a2ceda1fc8d376433f9d011172a6d0342bd" Feb 18 07:12:04 crc kubenswrapper[4707]: I0218 07:12:04.326296 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-frwrf_must-gather-98km4_0409d47c-4b51-4b82-86e9-be8f5fc24024/gather/0.log" Feb 18 07:12:12 crc kubenswrapper[4707]: I0218 07:12:12.053298 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:12:12 crc kubenswrapper[4707]: E0218 07:12:12.054224 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:12:12 crc kubenswrapper[4707]: 
I0218 07:12:12.290721 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-frwrf/must-gather-98km4"] Feb 18 07:12:12 crc kubenswrapper[4707]: I0218 07:12:12.291293 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-frwrf/must-gather-98km4" podUID="0409d47c-4b51-4b82-86e9-be8f5fc24024" containerName="copy" containerID="cri-o://295cb615aacc716f0c84378671a49b8267f27d88d4e7ef7f26ae4ff525a3e237" gracePeriod=2 Feb 18 07:12:12 crc kubenswrapper[4707]: I0218 07:12:12.302164 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-frwrf/must-gather-98km4"] Feb 18 07:12:12 crc kubenswrapper[4707]: I0218 07:12:12.684183 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-frwrf_must-gather-98km4_0409d47c-4b51-4b82-86e9-be8f5fc24024/copy/0.log" Feb 18 07:12:12 crc kubenswrapper[4707]: I0218 07:12:12.684767 4707 generic.go:334] "Generic (PLEG): container finished" podID="0409d47c-4b51-4b82-86e9-be8f5fc24024" containerID="295cb615aacc716f0c84378671a49b8267f27d88d4e7ef7f26ae4ff525a3e237" exitCode=143 Feb 18 07:12:12 crc kubenswrapper[4707]: I0218 07:12:12.959293 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-frwrf_must-gather-98km4_0409d47c-4b51-4b82-86e9-be8f5fc24024/copy/0.log" Feb 18 07:12:12 crc kubenswrapper[4707]: I0218 07:12:12.960090 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.005751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0409d47c-4b51-4b82-86e9-be8f5fc24024-must-gather-output\") pod \"0409d47c-4b51-4b82-86e9-be8f5fc24024\" (UID: \"0409d47c-4b51-4b82-86e9-be8f5fc24024\") " Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.005856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wtqp\" (UniqueName: \"kubernetes.io/projected/0409d47c-4b51-4b82-86e9-be8f5fc24024-kube-api-access-8wtqp\") pod \"0409d47c-4b51-4b82-86e9-be8f5fc24024\" (UID: \"0409d47c-4b51-4b82-86e9-be8f5fc24024\") " Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.012192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0409d47c-4b51-4b82-86e9-be8f5fc24024-kube-api-access-8wtqp" (OuterVolumeSpecName: "kube-api-access-8wtqp") pod "0409d47c-4b51-4b82-86e9-be8f5fc24024" (UID: "0409d47c-4b51-4b82-86e9-be8f5fc24024"). InnerVolumeSpecName "kube-api-access-8wtqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.108367 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wtqp\" (UniqueName: \"kubernetes.io/projected/0409d47c-4b51-4b82-86e9-be8f5fc24024-kube-api-access-8wtqp\") on node \"crc\" DevicePath \"\"" Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.206041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0409d47c-4b51-4b82-86e9-be8f5fc24024-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0409d47c-4b51-4b82-86e9-be8f5fc24024" (UID: "0409d47c-4b51-4b82-86e9-be8f5fc24024"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.209948 4707 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0409d47c-4b51-4b82-86e9-be8f5fc24024-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.696640 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-frwrf_must-gather-98km4_0409d47c-4b51-4b82-86e9-be8f5fc24024/copy/0.log" Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.697543 4707 scope.go:117] "RemoveContainer" containerID="295cb615aacc716f0c84378671a49b8267f27d88d4e7ef7f26ae4ff525a3e237" Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.697612 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-frwrf/must-gather-98km4" Feb 18 07:12:13 crc kubenswrapper[4707]: I0218 07:12:13.719535 4707 scope.go:117] "RemoveContainer" containerID="36363cb52d45dc1edada9d23f42f1a2ceda1fc8d376433f9d011172a6d0342bd" Feb 18 07:12:14 crc kubenswrapper[4707]: I0218 07:12:14.064782 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0409d47c-4b51-4b82-86e9-be8f5fc24024" path="/var/lib/kubelet/pods/0409d47c-4b51-4b82-86e9-be8f5fc24024/volumes" Feb 18 07:12:25 crc kubenswrapper[4707]: I0218 07:12:25.053656 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:12:25 crc kubenswrapper[4707]: E0218 07:12:25.057289 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" 
podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:12:40 crc kubenswrapper[4707]: I0218 07:12:40.053823 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:12:40 crc kubenswrapper[4707]: E0218 07:12:40.054570 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:12:54 crc kubenswrapper[4707]: I0218 07:12:54.060219 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:12:54 crc kubenswrapper[4707]: E0218 07:12:54.061173 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:12:57 crc kubenswrapper[4707]: I0218 07:12:57.147776 4707 scope.go:117] "RemoveContainer" containerID="876fc04fe13e167ec6453b0bab5515445b0d7f2e322731088c4e0fefec425934" Feb 18 07:13:09 crc kubenswrapper[4707]: I0218 07:13:09.053838 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:13:09 crc kubenswrapper[4707]: E0218 07:13:09.054585 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:13:23 crc kubenswrapper[4707]: I0218 07:13:23.052981 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:13:23 crc kubenswrapper[4707]: E0218 07:13:23.054703 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:13:34 crc kubenswrapper[4707]: I0218 07:13:34.060726 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:13:34 crc kubenswrapper[4707]: E0218 07:13:34.064340 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:13:46 crc kubenswrapper[4707]: I0218 07:13:46.054649 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:13:46 crc kubenswrapper[4707]: E0218 07:13:46.055469 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:13:58 crc kubenswrapper[4707]: I0218 07:13:58.053177 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:13:58 crc kubenswrapper[4707]: E0218 07:13:58.053880 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:14:12 crc kubenswrapper[4707]: I0218 07:14:12.054986 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:14:12 crc kubenswrapper[4707]: E0218 07:14:12.058482 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:14:23 crc kubenswrapper[4707]: I0218 07:14:23.054681 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:14:23 crc kubenswrapper[4707]: E0218 07:14:23.056166 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:14:36 crc kubenswrapper[4707]: I0218 07:14:36.053932 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:14:36 crc kubenswrapper[4707]: E0218 07:14:36.054633 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:14:48 crc kubenswrapper[4707]: I0218 07:14:48.053399 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:14:48 crc kubenswrapper[4707]: E0218 07:14:48.054409 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.156646 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp"] Feb 18 07:15:00 crc kubenswrapper[4707]: E0218 07:15:00.157712 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0409d47c-4b51-4b82-86e9-be8f5fc24024" 
containerName="gather" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.157827 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0409d47c-4b51-4b82-86e9-be8f5fc24024" containerName="gather" Feb 18 07:15:00 crc kubenswrapper[4707]: E0218 07:15:00.157847 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerName="extract-content" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.157857 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerName="extract-content" Feb 18 07:15:00 crc kubenswrapper[4707]: E0218 07:15:00.157887 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerName="registry-server" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.157895 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerName="registry-server" Feb 18 07:15:00 crc kubenswrapper[4707]: E0218 07:15:00.157912 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerName="extract-utilities" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.157919 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerName="extract-utilities" Feb 18 07:15:00 crc kubenswrapper[4707]: E0218 07:15:00.157931 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0409d47c-4b51-4b82-86e9-be8f5fc24024" containerName="copy" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.157938 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0409d47c-4b51-4b82-86e9-be8f5fc24024" containerName="copy" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.158172 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0409d47c-4b51-4b82-86e9-be8f5fc24024" containerName="gather" Feb 18 07:15:00 crc 
kubenswrapper[4707]: I0218 07:15:00.158190 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0409d47c-4b51-4b82-86e9-be8f5fc24024" containerName="copy" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.158202 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="906d4ac2-fe98-48ab-9a08-b9a5de5300af" containerName="registry-server" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.159049 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.161322 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.161400 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.165437 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp"] Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.218468 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzqh\" (UniqueName: \"kubernetes.io/projected/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-kube-api-access-ngzqh\") pod \"collect-profiles-29523315-zc4mp\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.218660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-config-volume\") pod \"collect-profiles-29523315-zc4mp\" (UID: 
\"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.219189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-secret-volume\") pod \"collect-profiles-29523315-zc4mp\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.321638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-config-volume\") pod \"collect-profiles-29523315-zc4mp\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.321923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-secret-volume\") pod \"collect-profiles-29523315-zc4mp\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.321975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzqh\" (UniqueName: \"kubernetes.io/projected/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-kube-api-access-ngzqh\") pod \"collect-profiles-29523315-zc4mp\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.322789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-config-volume\") pod \"collect-profiles-29523315-zc4mp\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.339351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-secret-volume\") pod \"collect-profiles-29523315-zc4mp\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.340773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzqh\" (UniqueName: \"kubernetes.io/projected/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-kube-api-access-ngzqh\") pod \"collect-profiles-29523315-zc4mp\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.481967 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:00 crc kubenswrapper[4707]: I0218 07:15:00.976522 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp"] Feb 18 07:15:01 crc kubenswrapper[4707]: I0218 07:15:01.188858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" event={"ID":"4dae3bb7-69c6-4d8a-aa3f-52be892215d6","Type":"ContainerStarted","Data":"18a9d6608ac38c4b655cd082ea1e2a147958bd5ecc79d8b3e6c367f9416582b0"} Feb 18 07:15:01 crc kubenswrapper[4707]: I0218 07:15:01.189161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" event={"ID":"4dae3bb7-69c6-4d8a-aa3f-52be892215d6","Type":"ContainerStarted","Data":"f59adccc0e6f62c0af85c8d1ed71602f563f6e97e4c929db8bc9d72c93904b65"} Feb 18 07:15:01 crc kubenswrapper[4707]: I0218 07:15:01.214085 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" podStartSLOduration=1.214062038 podStartE2EDuration="1.214062038s" podCreationTimestamp="2026-02-18 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 07:15:01.211502939 +0000 UTC m=+5237.859462073" watchObservedRunningTime="2026-02-18 07:15:01.214062038 +0000 UTC m=+5237.862021172" Feb 18 07:15:02 crc kubenswrapper[4707]: I0218 07:15:02.053640 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:15:02 crc kubenswrapper[4707]: I0218 07:15:02.198109 4707 generic.go:334] "Generic (PLEG): container finished" podID="4dae3bb7-69c6-4d8a-aa3f-52be892215d6" 
containerID="18a9d6608ac38c4b655cd082ea1e2a147958bd5ecc79d8b3e6c367f9416582b0" exitCode=0 Feb 18 07:15:02 crc kubenswrapper[4707]: I0218 07:15:02.198154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" event={"ID":"4dae3bb7-69c6-4d8a-aa3f-52be892215d6","Type":"ContainerDied","Data":"18a9d6608ac38c4b655cd082ea1e2a147958bd5ecc79d8b3e6c367f9416582b0"} Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.222980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"285a82f9ab065dc06e8aad4a395d2035da5abd7059a2b1dc1270dfccd3ef7a66"} Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.646071 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.798461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-secret-volume\") pod \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.798648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngzqh\" (UniqueName: \"kubernetes.io/projected/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-kube-api-access-ngzqh\") pod \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.798713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-config-volume\") pod 
\"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\" (UID: \"4dae3bb7-69c6-4d8a-aa3f-52be892215d6\") " Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.800016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "4dae3bb7-69c6-4d8a-aa3f-52be892215d6" (UID: "4dae3bb7-69c6-4d8a-aa3f-52be892215d6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.808472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-kube-api-access-ngzqh" (OuterVolumeSpecName: "kube-api-access-ngzqh") pod "4dae3bb7-69c6-4d8a-aa3f-52be892215d6" (UID: "4dae3bb7-69c6-4d8a-aa3f-52be892215d6"). InnerVolumeSpecName "kube-api-access-ngzqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.820322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4dae3bb7-69c6-4d8a-aa3f-52be892215d6" (UID: "4dae3bb7-69c6-4d8a-aa3f-52be892215d6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.901450 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngzqh\" (UniqueName: \"kubernetes.io/projected/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-kube-api-access-ngzqh\") on node \"crc\" DevicePath \"\"" Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.901503 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 07:15:03 crc kubenswrapper[4707]: I0218 07:15:03.901516 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dae3bb7-69c6-4d8a-aa3f-52be892215d6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 07:15:04 crc kubenswrapper[4707]: I0218 07:15:04.233336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" event={"ID":"4dae3bb7-69c6-4d8a-aa3f-52be892215d6","Type":"ContainerDied","Data":"f59adccc0e6f62c0af85c8d1ed71602f563f6e97e4c929db8bc9d72c93904b65"} Feb 18 07:15:04 crc kubenswrapper[4707]: I0218 07:15:04.234442 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f59adccc0e6f62c0af85c8d1ed71602f563f6e97e4c929db8bc9d72c93904b65" Feb 18 07:15:04 crc kubenswrapper[4707]: I0218 07:15:04.233423 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523315-zc4mp" Feb 18 07:15:04 crc kubenswrapper[4707]: I0218 07:15:04.282345 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97"] Feb 18 07:15:04 crc kubenswrapper[4707]: I0218 07:15:04.295046 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523270-fnc97"] Feb 18 07:15:06 crc kubenswrapper[4707]: I0218 07:15:06.065712 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e25207ca-54be-481f-a992-f5de337090f9" path="/var/lib/kubelet/pods/e25207ca-54be-481f-a992-f5de337090f9/volumes" Feb 18 07:15:13 crc kubenswrapper[4707]: I0218 07:15:13.786066 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zj4rl"] Feb 18 07:15:13 crc kubenswrapper[4707]: E0218 07:15:13.787110 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dae3bb7-69c6-4d8a-aa3f-52be892215d6" containerName="collect-profiles" Feb 18 07:15:13 crc kubenswrapper[4707]: I0218 07:15:13.787126 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dae3bb7-69c6-4d8a-aa3f-52be892215d6" containerName="collect-profiles" Feb 18 07:15:13 crc kubenswrapper[4707]: I0218 07:15:13.787386 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dae3bb7-69c6-4d8a-aa3f-52be892215d6" containerName="collect-profiles" Feb 18 07:15:13 crc kubenswrapper[4707]: I0218 07:15:13.788844 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:13 crc kubenswrapper[4707]: I0218 07:15:13.795591 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zj4rl"] Feb 18 07:15:13 crc kubenswrapper[4707]: I0218 07:15:13.914345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-utilities\") pod \"certified-operators-zj4rl\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:13 crc kubenswrapper[4707]: I0218 07:15:13.914551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l84tt\" (UniqueName: \"kubernetes.io/projected/613f260f-a319-49b2-b978-4a446334b9b4-kube-api-access-l84tt\") pod \"certified-operators-zj4rl\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:13 crc kubenswrapper[4707]: I0218 07:15:13.914759 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-catalog-content\") pod \"certified-operators-zj4rl\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:14 crc kubenswrapper[4707]: I0218 07:15:14.016397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l84tt\" (UniqueName: \"kubernetes.io/projected/613f260f-a319-49b2-b978-4a446334b9b4-kube-api-access-l84tt\") pod \"certified-operators-zj4rl\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:14 crc kubenswrapper[4707]: I0218 07:15:14.016552 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-catalog-content\") pod \"certified-operators-zj4rl\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:14 crc kubenswrapper[4707]: I0218 07:15:14.016578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-utilities\") pod \"certified-operators-zj4rl\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:14 crc kubenswrapper[4707]: I0218 07:15:14.017134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-utilities\") pod \"certified-operators-zj4rl\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:14 crc kubenswrapper[4707]: I0218 07:15:14.017199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-catalog-content\") pod \"certified-operators-zj4rl\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:14 crc kubenswrapper[4707]: I0218 07:15:14.043305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l84tt\" (UniqueName: \"kubernetes.io/projected/613f260f-a319-49b2-b978-4a446334b9b4-kube-api-access-l84tt\") pod \"certified-operators-zj4rl\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:14 crc kubenswrapper[4707]: I0218 07:15:14.115525 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:14 crc kubenswrapper[4707]: I0218 07:15:14.643118 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zj4rl"] Feb 18 07:15:15 crc kubenswrapper[4707]: E0218 07:15:15.012494 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod613f260f_a319_49b2_b978_4a446334b9b4.slice/crio-bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod613f260f_a319_49b2_b978_4a446334b9b4.slice/crio-conmon-bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a.scope\": RecentStats: unable to find data in memory cache]" Feb 18 07:15:15 crc kubenswrapper[4707]: I0218 07:15:15.336944 4707 generic.go:334] "Generic (PLEG): container finished" podID="613f260f-a319-49b2-b978-4a446334b9b4" containerID="bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a" exitCode=0 Feb 18 07:15:15 crc kubenswrapper[4707]: I0218 07:15:15.337003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj4rl" event={"ID":"613f260f-a319-49b2-b978-4a446334b9b4","Type":"ContainerDied","Data":"bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a"} Feb 18 07:15:15 crc kubenswrapper[4707]: I0218 07:15:15.337034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj4rl" event={"ID":"613f260f-a319-49b2-b978-4a446334b9b4","Type":"ContainerStarted","Data":"647238fb4c6087d99258f889ed22c048ec890334aaf4941a304705862d721014"} Feb 18 07:15:16 crc kubenswrapper[4707]: I0218 07:15:16.347948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj4rl" 
event={"ID":"613f260f-a319-49b2-b978-4a446334b9b4","Type":"ContainerStarted","Data":"c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b"} Feb 18 07:15:17 crc kubenswrapper[4707]: I0218 07:15:17.360978 4707 generic.go:334] "Generic (PLEG): container finished" podID="613f260f-a319-49b2-b978-4a446334b9b4" containerID="c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b" exitCode=0 Feb 18 07:15:17 crc kubenswrapper[4707]: I0218 07:15:17.361046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj4rl" event={"ID":"613f260f-a319-49b2-b978-4a446334b9b4","Type":"ContainerDied","Data":"c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b"} Feb 18 07:15:18 crc kubenswrapper[4707]: I0218 07:15:18.370594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj4rl" event={"ID":"613f260f-a319-49b2-b978-4a446334b9b4","Type":"ContainerStarted","Data":"3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f"} Feb 18 07:15:18 crc kubenswrapper[4707]: I0218 07:15:18.388604 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zj4rl" podStartSLOduration=2.959114998 podStartE2EDuration="5.388587664s" podCreationTimestamp="2026-02-18 07:15:13 +0000 UTC" firstStartedPulling="2026-02-18 07:15:15.338943694 +0000 UTC m=+5251.986902828" lastFinishedPulling="2026-02-18 07:15:17.76841636 +0000 UTC m=+5254.416375494" observedRunningTime="2026-02-18 07:15:18.38696635 +0000 UTC m=+5255.034925494" watchObservedRunningTime="2026-02-18 07:15:18.388587664 +0000 UTC m=+5255.036546788" Feb 18 07:15:24 crc kubenswrapper[4707]: I0218 07:15:24.116455 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:24 crc kubenswrapper[4707]: I0218 07:15:24.117098 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:24 crc kubenswrapper[4707]: I0218 07:15:24.272545 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:24 crc kubenswrapper[4707]: I0218 07:15:24.466561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:24 crc kubenswrapper[4707]: I0218 07:15:24.523567 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zj4rl"] Feb 18 07:15:26 crc kubenswrapper[4707]: I0218 07:15:26.434551 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zj4rl" podUID="613f260f-a319-49b2-b978-4a446334b9b4" containerName="registry-server" containerID="cri-o://3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f" gracePeriod=2 Feb 18 07:15:26 crc kubenswrapper[4707]: I0218 07:15:26.893340 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.069249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-catalog-content\") pod \"613f260f-a319-49b2-b978-4a446334b9b4\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.069366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l84tt\" (UniqueName: \"kubernetes.io/projected/613f260f-a319-49b2-b978-4a446334b9b4-kube-api-access-l84tt\") pod \"613f260f-a319-49b2-b978-4a446334b9b4\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.069484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-utilities\") pod \"613f260f-a319-49b2-b978-4a446334b9b4\" (UID: \"613f260f-a319-49b2-b978-4a446334b9b4\") " Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.070475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-utilities" (OuterVolumeSpecName: "utilities") pod "613f260f-a319-49b2-b978-4a446334b9b4" (UID: "613f260f-a319-49b2-b978-4a446334b9b4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.071527 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.082733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/613f260f-a319-49b2-b978-4a446334b9b4-kube-api-access-l84tt" (OuterVolumeSpecName: "kube-api-access-l84tt") pod "613f260f-a319-49b2-b978-4a446334b9b4" (UID: "613f260f-a319-49b2-b978-4a446334b9b4"). InnerVolumeSpecName "kube-api-access-l84tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.173124 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l84tt\" (UniqueName: \"kubernetes.io/projected/613f260f-a319-49b2-b978-4a446334b9b4-kube-api-access-l84tt\") on node \"crc\" DevicePath \"\"" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.383383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "613f260f-a319-49b2-b978-4a446334b9b4" (UID: "613f260f-a319-49b2-b978-4a446334b9b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.444434 4707 generic.go:334] "Generic (PLEG): container finished" podID="613f260f-a319-49b2-b978-4a446334b9b4" containerID="3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f" exitCode=0 Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.444484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj4rl" event={"ID":"613f260f-a319-49b2-b978-4a446334b9b4","Type":"ContainerDied","Data":"3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f"} Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.444516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj4rl" event={"ID":"613f260f-a319-49b2-b978-4a446334b9b4","Type":"ContainerDied","Data":"647238fb4c6087d99258f889ed22c048ec890334aaf4941a304705862d721014"} Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.444527 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zj4rl" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.444539 4707 scope.go:117] "RemoveContainer" containerID="3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.473212 4707 scope.go:117] "RemoveContainer" containerID="c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.478247 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613f260f-a319-49b2-b978-4a446334b9b4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.478281 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zj4rl"] Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.490212 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zj4rl"] Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.495146 4707 scope.go:117] "RemoveContainer" containerID="bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.537697 4707 scope.go:117] "RemoveContainer" containerID="3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f" Feb 18 07:15:27 crc kubenswrapper[4707]: E0218 07:15:27.538197 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f\": container with ID starting with 3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f not found: ID does not exist" containerID="3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.538238 4707 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f"} err="failed to get container status \"3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f\": rpc error: code = NotFound desc = could not find container \"3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f\": container with ID starting with 3d8ebf9e29220455fcf5fd2d7aac977af40eb08a5e1fe4bb0bcec0090369518f not found: ID does not exist" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.538266 4707 scope.go:117] "RemoveContainer" containerID="c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b" Feb 18 07:15:27 crc kubenswrapper[4707]: E0218 07:15:27.539085 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b\": container with ID starting with c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b not found: ID does not exist" containerID="c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.539138 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b"} err="failed to get container status \"c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b\": rpc error: code = NotFound desc = could not find container \"c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b\": container with ID starting with c550f22a51d773275b1c39915015a510154ecf447325f57e590822d00217a54b not found: ID does not exist" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.539172 4707 scope.go:117] "RemoveContainer" containerID="bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a" Feb 18 07:15:27 crc kubenswrapper[4707]: E0218 07:15:27.539516 4707 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a\": container with ID starting with bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a not found: ID does not exist" containerID="bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a" Feb 18 07:15:27 crc kubenswrapper[4707]: I0218 07:15:27.539548 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a"} err="failed to get container status \"bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a\": rpc error: code = NotFound desc = could not find container \"bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a\": container with ID starting with bd9f349058f494d2621f3120b3451a6c62ae78687703b68f1324717b661bfa9a not found: ID does not exist" Feb 18 07:15:28 crc kubenswrapper[4707]: I0218 07:15:28.063980 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="613f260f-a319-49b2-b978-4a446334b9b4" path="/var/lib/kubelet/pods/613f260f-a319-49b2-b978-4a446334b9b4/volumes" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.066033 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hqdv7/must-gather-gn9z7"] Feb 18 07:15:40 crc kubenswrapper[4707]: E0218 07:15:40.066808 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613f260f-a319-49b2-b978-4a446334b9b4" containerName="extract-content" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.066823 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="613f260f-a319-49b2-b978-4a446334b9b4" containerName="extract-content" Feb 18 07:15:40 crc kubenswrapper[4707]: E0218 07:15:40.066842 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613f260f-a319-49b2-b978-4a446334b9b4" containerName="extract-utilities" Feb 18 07:15:40 crc 
kubenswrapper[4707]: I0218 07:15:40.066848 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="613f260f-a319-49b2-b978-4a446334b9b4" containerName="extract-utilities" Feb 18 07:15:40 crc kubenswrapper[4707]: E0218 07:15:40.066864 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613f260f-a319-49b2-b978-4a446334b9b4" containerName="registry-server" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.066870 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="613f260f-a319-49b2-b978-4a446334b9b4" containerName="registry-server" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.067061 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="613f260f-a319-49b2-b978-4a446334b9b4" containerName="registry-server" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.068207 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.070234 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hqdv7"/"openshift-service-ca.crt" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.070359 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hqdv7"/"kube-root-ca.crt" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.082735 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hqdv7/must-gather-gn9z7"] Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.230725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57989\" (UniqueName: \"kubernetes.io/projected/ff999da3-3e5d-4483-81d7-721897325f90-kube-api-access-57989\") pod \"must-gather-gn9z7\" (UID: \"ff999da3-3e5d-4483-81d7-721897325f90\") " pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.231089 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff999da3-3e5d-4483-81d7-721897325f90-must-gather-output\") pod \"must-gather-gn9z7\" (UID: \"ff999da3-3e5d-4483-81d7-721897325f90\") " pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.333371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57989\" (UniqueName: \"kubernetes.io/projected/ff999da3-3e5d-4483-81d7-721897325f90-kube-api-access-57989\") pod \"must-gather-gn9z7\" (UID: \"ff999da3-3e5d-4483-81d7-721897325f90\") " pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.333760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff999da3-3e5d-4483-81d7-721897325f90-must-gather-output\") pod \"must-gather-gn9z7\" (UID: \"ff999da3-3e5d-4483-81d7-721897325f90\") " pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.334118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff999da3-3e5d-4483-81d7-721897325f90-must-gather-output\") pod \"must-gather-gn9z7\" (UID: \"ff999da3-3e5d-4483-81d7-721897325f90\") " pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.362541 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57989\" (UniqueName: \"kubernetes.io/projected/ff999da3-3e5d-4483-81d7-721897325f90-kube-api-access-57989\") pod \"must-gather-gn9z7\" (UID: \"ff999da3-3e5d-4483-81d7-721897325f90\") " pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.391039 4707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:15:40 crc kubenswrapper[4707]: I0218 07:15:40.991550 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hqdv7/must-gather-gn9z7"] Feb 18 07:15:41 crc kubenswrapper[4707]: I0218 07:15:41.586228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" event={"ID":"ff999da3-3e5d-4483-81d7-721897325f90","Type":"ContainerStarted","Data":"b60446957220bf223cce9f544188f1acedb4224c6e95f7e3d46beaea45fb343b"} Feb 18 07:15:41 crc kubenswrapper[4707]: I0218 07:15:41.586820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" event={"ID":"ff999da3-3e5d-4483-81d7-721897325f90","Type":"ContainerStarted","Data":"dee44fba5532e5320af1c501dc94fe6dd710968646ba6a45e9d03fbb566744fd"} Feb 18 07:15:41 crc kubenswrapper[4707]: I0218 07:15:41.586838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" event={"ID":"ff999da3-3e5d-4483-81d7-721897325f90","Type":"ContainerStarted","Data":"321f23fec38d9657191ba0ad61867c31c721d4f836b21ff60086352aafe8182d"} Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.341740 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" podStartSLOduration=6.341717887 podStartE2EDuration="6.341717887s" podCreationTimestamp="2026-02-18 07:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 07:15:41.603341025 +0000 UTC m=+5278.251300159" watchObservedRunningTime="2026-02-18 07:15:46.341717887 +0000 UTC m=+5282.989677021" Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.345752 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hqdv7/crc-debug-fjkns"] Feb 18 
07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.347056 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.348895 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hqdv7"/"default-dockercfg-nw5bs" Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.513992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7nz\" (UniqueName: \"kubernetes.io/projected/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-kube-api-access-6f7nz\") pod \"crc-debug-fjkns\" (UID: \"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\") " pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.514079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-host\") pod \"crc-debug-fjkns\" (UID: \"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\") " pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.615692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7nz\" (UniqueName: \"kubernetes.io/projected/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-kube-api-access-6f7nz\") pod \"crc-debug-fjkns\" (UID: \"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\") " pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.616194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-host\") pod \"crc-debug-fjkns\" (UID: \"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\") " pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.616326 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-host\") pod \"crc-debug-fjkns\" (UID: \"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\") " pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.639016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7nz\" (UniqueName: \"kubernetes.io/projected/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-kube-api-access-6f7nz\") pod \"crc-debug-fjkns\" (UID: \"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\") " pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:15:46 crc kubenswrapper[4707]: I0218 07:15:46.666033 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:15:47 crc kubenswrapper[4707]: I0218 07:15:47.637096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/crc-debug-fjkns" event={"ID":"b2b53702-b7ce-4d53-8938-15ed5f5da5ad","Type":"ContainerStarted","Data":"8f4b5826dc5353e0c1379678752bbb82b2b64fec1130fc49ad0b18b05687c542"} Feb 18 07:15:47 crc kubenswrapper[4707]: I0218 07:15:47.637666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/crc-debug-fjkns" event={"ID":"b2b53702-b7ce-4d53-8938-15ed5f5da5ad","Type":"ContainerStarted","Data":"978f2a89f89a06467de2be11e8c318574d0428b1a2028059a6b07472fca5c4c9"} Feb 18 07:15:47 crc kubenswrapper[4707]: I0218 07:15:47.650416 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hqdv7/crc-debug-fjkns" podStartSLOduration=1.650400178 podStartE2EDuration="1.650400178s" podCreationTimestamp="2026-02-18 07:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 07:15:47.649405622 +0000 UTC m=+5284.297364756" 
watchObservedRunningTime="2026-02-18 07:15:47.650400178 +0000 UTC m=+5284.298359312" Feb 18 07:15:57 crc kubenswrapper[4707]: I0218 07:15:57.257893 4707 scope.go:117] "RemoveContainer" containerID="e8de6d7eff3647717012ff73d8fd074e59d48307892d8b51655a73f09bf12963" Feb 18 07:16:26 crc kubenswrapper[4707]: I0218 07:16:26.733955 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2b53702-b7ce-4d53-8938-15ed5f5da5ad" containerID="8f4b5826dc5353e0c1379678752bbb82b2b64fec1130fc49ad0b18b05687c542" exitCode=0 Feb 18 07:16:26 crc kubenswrapper[4707]: I0218 07:16:26.734095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/crc-debug-fjkns" event={"ID":"b2b53702-b7ce-4d53-8938-15ed5f5da5ad","Type":"ContainerDied","Data":"8f4b5826dc5353e0c1379678752bbb82b2b64fec1130fc49ad0b18b05687c542"} Feb 18 07:16:27 crc kubenswrapper[4707]: I0218 07:16:27.853847 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:16:27 crc kubenswrapper[4707]: I0218 07:16:27.889781 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hqdv7/crc-debug-fjkns"] Feb 18 07:16:27 crc kubenswrapper[4707]: I0218 07:16:27.897607 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hqdv7/crc-debug-fjkns"] Feb 18 07:16:28 crc kubenswrapper[4707]: I0218 07:16:28.044160 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-host\") pod \"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\" (UID: \"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\") " Feb 18 07:16:28 crc kubenswrapper[4707]: I0218 07:16:28.044277 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f7nz\" (UniqueName: \"kubernetes.io/projected/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-kube-api-access-6f7nz\") pod 
\"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\" (UID: \"b2b53702-b7ce-4d53-8938-15ed5f5da5ad\") " Feb 18 07:16:28 crc kubenswrapper[4707]: I0218 07:16:28.044318 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-host" (OuterVolumeSpecName: "host") pod "b2b53702-b7ce-4d53-8938-15ed5f5da5ad" (UID: "b2b53702-b7ce-4d53-8938-15ed5f5da5ad"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 07:16:28 crc kubenswrapper[4707]: I0218 07:16:28.055252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-kube-api-access-6f7nz" (OuterVolumeSpecName: "kube-api-access-6f7nz") pod "b2b53702-b7ce-4d53-8938-15ed5f5da5ad" (UID: "b2b53702-b7ce-4d53-8938-15ed5f5da5ad"). InnerVolumeSpecName "kube-api-access-6f7nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:16:28 crc kubenswrapper[4707]: I0218 07:16:28.065668 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2b53702-b7ce-4d53-8938-15ed5f5da5ad" path="/var/lib/kubelet/pods/b2b53702-b7ce-4d53-8938-15ed5f5da5ad/volumes" Feb 18 07:16:28 crc kubenswrapper[4707]: I0218 07:16:28.146927 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f7nz\" (UniqueName: \"kubernetes.io/projected/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-kube-api-access-6f7nz\") on node \"crc\" DevicePath \"\"" Feb 18 07:16:28 crc kubenswrapper[4707]: I0218 07:16:28.146960 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2b53702-b7ce-4d53-8938-15ed5f5da5ad-host\") on node \"crc\" DevicePath \"\"" Feb 18 07:16:28 crc kubenswrapper[4707]: I0218 07:16:28.760475 4707 scope.go:117] "RemoveContainer" containerID="8f4b5826dc5353e0c1379678752bbb82b2b64fec1130fc49ad0b18b05687c542" Feb 18 07:16:28 crc kubenswrapper[4707]: I0218 07:16:28.760524 4707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-fjkns" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.087770 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hqdv7/crc-debug-fm98f"] Feb 18 07:16:29 crc kubenswrapper[4707]: E0218 07:16:29.088162 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b53702-b7ce-4d53-8938-15ed5f5da5ad" containerName="container-00" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.088174 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b53702-b7ce-4d53-8938-15ed5f5da5ad" containerName="container-00" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.088405 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b53702-b7ce-4d53-8938-15ed5f5da5ad" containerName="container-00" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.089042 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.091508 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hqdv7"/"default-dockercfg-nw5bs" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.268739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ba0f07-669d-4e2b-8a8d-2d0219616feb-host\") pod \"crc-debug-fm98f\" (UID: \"23ba0f07-669d-4e2b-8a8d-2d0219616feb\") " pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.268840 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fkxj\" (UniqueName: \"kubernetes.io/projected/23ba0f07-669d-4e2b-8a8d-2d0219616feb-kube-api-access-5fkxj\") pod \"crc-debug-fm98f\" (UID: 
\"23ba0f07-669d-4e2b-8a8d-2d0219616feb\") " pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.370959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ba0f07-669d-4e2b-8a8d-2d0219616feb-host\") pod \"crc-debug-fm98f\" (UID: \"23ba0f07-669d-4e2b-8a8d-2d0219616feb\") " pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.371137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ba0f07-669d-4e2b-8a8d-2d0219616feb-host\") pod \"crc-debug-fm98f\" (UID: \"23ba0f07-669d-4e2b-8a8d-2d0219616feb\") " pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.371460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fkxj\" (UniqueName: \"kubernetes.io/projected/23ba0f07-669d-4e2b-8a8d-2d0219616feb-kube-api-access-5fkxj\") pod \"crc-debug-fm98f\" (UID: \"23ba0f07-669d-4e2b-8a8d-2d0219616feb\") " pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.397823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fkxj\" (UniqueName: \"kubernetes.io/projected/23ba0f07-669d-4e2b-8a8d-2d0219616feb-kube-api-access-5fkxj\") pod \"crc-debug-fm98f\" (UID: \"23ba0f07-669d-4e2b-8a8d-2d0219616feb\") " pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.411442 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.771551 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/crc-debug-fm98f" event={"ID":"23ba0f07-669d-4e2b-8a8d-2d0219616feb","Type":"ContainerStarted","Data":"455e3b185dc665bfda1ad672dfa935aaf61a096ea80e107647a15fca0e382c1e"} Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.771965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/crc-debug-fm98f" event={"ID":"23ba0f07-669d-4e2b-8a8d-2d0219616feb","Type":"ContainerStarted","Data":"37086326d46c6947d2a50ed42d10b93b0535bdec811e1ebf3aafd0d65254a83b"} Feb 18 07:16:29 crc kubenswrapper[4707]: I0218 07:16:29.794572 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hqdv7/crc-debug-fm98f" podStartSLOduration=0.794556302 podStartE2EDuration="794.556302ms" podCreationTimestamp="2026-02-18 07:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 07:16:29.786249787 +0000 UTC m=+5326.434208921" watchObservedRunningTime="2026-02-18 07:16:29.794556302 +0000 UTC m=+5326.442515426" Feb 18 07:16:30 crc kubenswrapper[4707]: I0218 07:16:30.783421 4707 generic.go:334] "Generic (PLEG): container finished" podID="23ba0f07-669d-4e2b-8a8d-2d0219616feb" containerID="455e3b185dc665bfda1ad672dfa935aaf61a096ea80e107647a15fca0e382c1e" exitCode=0 Feb 18 07:16:30 crc kubenswrapper[4707]: I0218 07:16:30.783750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/crc-debug-fm98f" event={"ID":"23ba0f07-669d-4e2b-8a8d-2d0219616feb","Type":"ContainerDied","Data":"455e3b185dc665bfda1ad672dfa935aaf61a096ea80e107647a15fca0e382c1e"} Feb 18 07:16:31 crc kubenswrapper[4707]: I0218 07:16:31.906926 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.015493 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ba0f07-669d-4e2b-8a8d-2d0219616feb-host\") pod \"23ba0f07-669d-4e2b-8a8d-2d0219616feb\" (UID: \"23ba0f07-669d-4e2b-8a8d-2d0219616feb\") " Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.015595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23ba0f07-669d-4e2b-8a8d-2d0219616feb-host" (OuterVolumeSpecName: "host") pod "23ba0f07-669d-4e2b-8a8d-2d0219616feb" (UID: "23ba0f07-669d-4e2b-8a8d-2d0219616feb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.015667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fkxj\" (UniqueName: \"kubernetes.io/projected/23ba0f07-669d-4e2b-8a8d-2d0219616feb-kube-api-access-5fkxj\") pod \"23ba0f07-669d-4e2b-8a8d-2d0219616feb\" (UID: \"23ba0f07-669d-4e2b-8a8d-2d0219616feb\") " Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.016148 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ba0f07-669d-4e2b-8a8d-2d0219616feb-host\") on node \"crc\" DevicePath \"\"" Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.022029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ba0f07-669d-4e2b-8a8d-2d0219616feb-kube-api-access-5fkxj" (OuterVolumeSpecName: "kube-api-access-5fkxj") pod "23ba0f07-669d-4e2b-8a8d-2d0219616feb" (UID: "23ba0f07-669d-4e2b-8a8d-2d0219616feb"). InnerVolumeSpecName "kube-api-access-5fkxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.117771 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fkxj\" (UniqueName: \"kubernetes.io/projected/23ba0f07-669d-4e2b-8a8d-2d0219616feb-kube-api-access-5fkxj\") on node \"crc\" DevicePath \"\"" Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.493562 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hqdv7/crc-debug-fm98f"] Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.501993 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hqdv7/crc-debug-fm98f"] Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.804985 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37086326d46c6947d2a50ed42d10b93b0535bdec811e1ebf3aafd0d65254a83b" Feb 18 07:16:32 crc kubenswrapper[4707]: I0218 07:16:32.805045 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-fm98f" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.656662 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hqdv7/crc-debug-h5jmg"] Feb 18 07:16:33 crc kubenswrapper[4707]: E0218 07:16:33.658813 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ba0f07-669d-4e2b-8a8d-2d0219616feb" containerName="container-00" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.659349 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ba0f07-669d-4e2b-8a8d-2d0219616feb" containerName="container-00" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.660044 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ba0f07-669d-4e2b-8a8d-2d0219616feb" containerName="container-00" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.660901 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.663792 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hqdv7"/"default-dockercfg-nw5bs" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.761503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzczg\" (UniqueName: \"kubernetes.io/projected/ec745b7b-f620-45c6-9b0b-613c7d299f21-kube-api-access-dzczg\") pod \"crc-debug-h5jmg\" (UID: \"ec745b7b-f620-45c6-9b0b-613c7d299f21\") " pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.761588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec745b7b-f620-45c6-9b0b-613c7d299f21-host\") pod \"crc-debug-h5jmg\" (UID: \"ec745b7b-f620-45c6-9b0b-613c7d299f21\") " pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.863383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzczg\" (UniqueName: \"kubernetes.io/projected/ec745b7b-f620-45c6-9b0b-613c7d299f21-kube-api-access-dzczg\") pod \"crc-debug-h5jmg\" (UID: \"ec745b7b-f620-45c6-9b0b-613c7d299f21\") " pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.863488 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec745b7b-f620-45c6-9b0b-613c7d299f21-host\") pod \"crc-debug-h5jmg\" (UID: \"ec745b7b-f620-45c6-9b0b-613c7d299f21\") " pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.863700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ec745b7b-f620-45c6-9b0b-613c7d299f21-host\") pod \"crc-debug-h5jmg\" (UID: \"ec745b7b-f620-45c6-9b0b-613c7d299f21\") " pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.884189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzczg\" (UniqueName: \"kubernetes.io/projected/ec745b7b-f620-45c6-9b0b-613c7d299f21-kube-api-access-dzczg\") pod \"crc-debug-h5jmg\" (UID: \"ec745b7b-f620-45c6-9b0b-613c7d299f21\") " pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:33 crc kubenswrapper[4707]: I0218 07:16:33.990004 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:34 crc kubenswrapper[4707]: W0218 07:16:34.018289 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec745b7b_f620_45c6_9b0b_613c7d299f21.slice/crio-0baf76389df91d067e4e9935a9ca981900668e9d1f0b061566966f9328d69135 WatchSource:0}: Error finding container 0baf76389df91d067e4e9935a9ca981900668e9d1f0b061566966f9328d69135: Status 404 returned error can't find the container with id 0baf76389df91d067e4e9935a9ca981900668e9d1f0b061566966f9328d69135 Feb 18 07:16:34 crc kubenswrapper[4707]: I0218 07:16:34.065779 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ba0f07-669d-4e2b-8a8d-2d0219616feb" path="/var/lib/kubelet/pods/23ba0f07-669d-4e2b-8a8d-2d0219616feb/volumes" Feb 18 07:16:34 crc kubenswrapper[4707]: I0218 07:16:34.833076 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec745b7b-f620-45c6-9b0b-613c7d299f21" containerID="d6dc7c28ea048d3d0bfbd74c5e36646de81fa1aeda549652b1eb9be99e374486" exitCode=0 Feb 18 07:16:34 crc kubenswrapper[4707]: I0218 07:16:34.833344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" 
event={"ID":"ec745b7b-f620-45c6-9b0b-613c7d299f21","Type":"ContainerDied","Data":"d6dc7c28ea048d3d0bfbd74c5e36646de81fa1aeda549652b1eb9be99e374486"} Feb 18 07:16:34 crc kubenswrapper[4707]: I0218 07:16:34.833380 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" event={"ID":"ec745b7b-f620-45c6-9b0b-613c7d299f21","Type":"ContainerStarted","Data":"0baf76389df91d067e4e9935a9ca981900668e9d1f0b061566966f9328d69135"} Feb 18 07:16:34 crc kubenswrapper[4707]: I0218 07:16:34.877689 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hqdv7/crc-debug-h5jmg"] Feb 18 07:16:34 crc kubenswrapper[4707]: I0218 07:16:34.886854 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hqdv7/crc-debug-h5jmg"] Feb 18 07:16:35 crc kubenswrapper[4707]: I0218 07:16:35.932030 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:36 crc kubenswrapper[4707]: I0218 07:16:36.101181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzczg\" (UniqueName: \"kubernetes.io/projected/ec745b7b-f620-45c6-9b0b-613c7d299f21-kube-api-access-dzczg\") pod \"ec745b7b-f620-45c6-9b0b-613c7d299f21\" (UID: \"ec745b7b-f620-45c6-9b0b-613c7d299f21\") " Feb 18 07:16:36 crc kubenswrapper[4707]: I0218 07:16:36.101614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec745b7b-f620-45c6-9b0b-613c7d299f21-host\") pod \"ec745b7b-f620-45c6-9b0b-613c7d299f21\" (UID: \"ec745b7b-f620-45c6-9b0b-613c7d299f21\") " Feb 18 07:16:36 crc kubenswrapper[4707]: I0218 07:16:36.101685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec745b7b-f620-45c6-9b0b-613c7d299f21-host" (OuterVolumeSpecName: "host") pod "ec745b7b-f620-45c6-9b0b-613c7d299f21" (UID: 
"ec745b7b-f620-45c6-9b0b-613c7d299f21"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 07:16:36 crc kubenswrapper[4707]: I0218 07:16:36.102559 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec745b7b-f620-45c6-9b0b-613c7d299f21-host\") on node \"crc\" DevicePath \"\"" Feb 18 07:16:36 crc kubenswrapper[4707]: I0218 07:16:36.107069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec745b7b-f620-45c6-9b0b-613c7d299f21-kube-api-access-dzczg" (OuterVolumeSpecName: "kube-api-access-dzczg") pod "ec745b7b-f620-45c6-9b0b-613c7d299f21" (UID: "ec745b7b-f620-45c6-9b0b-613c7d299f21"). InnerVolumeSpecName "kube-api-access-dzczg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:16:36 crc kubenswrapper[4707]: I0218 07:16:36.206310 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzczg\" (UniqueName: \"kubernetes.io/projected/ec745b7b-f620-45c6-9b0b-613c7d299f21-kube-api-access-dzczg\") on node \"crc\" DevicePath \"\"" Feb 18 07:16:36 crc kubenswrapper[4707]: I0218 07:16:36.849905 4707 scope.go:117] "RemoveContainer" containerID="d6dc7c28ea048d3d0bfbd74c5e36646de81fa1aeda549652b1eb9be99e374486" Feb 18 07:16:36 crc kubenswrapper[4707]: I0218 07:16:36.849963 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hqdv7/crc-debug-h5jmg" Feb 18 07:16:38 crc kubenswrapper[4707]: I0218 07:16:38.063177 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec745b7b-f620-45c6-9b0b-613c7d299f21" path="/var/lib/kubelet/pods/ec745b7b-f620-45c6-9b0b-613c7d299f21/volumes" Feb 18 07:16:59 crc kubenswrapper[4707]: I0218 07:16:59.785245 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-fc8b85554-bcs7j_13bd62ec-d5ea-4ad3-8020-0cc244072675/barbican-api/0.log" Feb 18 07:17:00 crc kubenswrapper[4707]: I0218 07:17:00.117301 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-fc8b85554-bcs7j_13bd62ec-d5ea-4ad3-8020-0cc244072675/barbican-api-log/0.log" Feb 18 07:17:00 crc kubenswrapper[4707]: I0218 07:17:00.174770 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-747894f44d-cbjjh_b16501ed-460b-4a5c-8d59-9acddd5e1011/barbican-keystone-listener/0.log" Feb 18 07:17:00 crc kubenswrapper[4707]: I0218 07:17:00.409425 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-667ddf5c59-6bpbq_06b50daa-d6b3-4865-b224-516392956313/barbican-worker/0.log" Feb 18 07:17:00 crc kubenswrapper[4707]: I0218 07:17:00.498852 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-667ddf5c59-6bpbq_06b50daa-d6b3-4865-b224-516392956313/barbican-worker-log/0.log" Feb 18 07:17:00 crc kubenswrapper[4707]: I0218 07:17:00.731830 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-24v68_3c7dd778-4759-4515-bbf0-bbc5123e822f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:00 crc kubenswrapper[4707]: I0218 07:17:00.917235 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-747894f44d-cbjjh_b16501ed-460b-4a5c-8d59-9acddd5e1011/barbican-keystone-listener-log/0.log" Feb 18 07:17:00 crc kubenswrapper[4707]: I0218 07:17:00.971745 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b/ceilometer-central-agent/0.log" Feb 18 07:17:01 crc kubenswrapper[4707]: I0218 07:17:01.005997 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b/proxy-httpd/0.log" Feb 18 07:17:01 crc kubenswrapper[4707]: I0218 07:17:01.085322 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b/ceilometer-notification-agent/0.log" Feb 18 07:17:01 crc kubenswrapper[4707]: I0218 07:17:01.140259 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a7cd13b0-5afc-4b19-a14e-bd4ba98fd78b/sg-core/0.log" Feb 18 07:17:01 crc kubenswrapper[4707]: I0218 07:17:01.403164 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_1b24c097-807d-43e6-aaa5-b9abfb48bff5/ceph/0.log" Feb 18 07:17:01 crc kubenswrapper[4707]: I0218 07:17:01.601839 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6f03e391-db4f-46dd-b206-94e9f6d65e68/cinder-api/0.log" Feb 18 07:17:01 crc kubenswrapper[4707]: I0218 07:17:01.621890 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6f03e391-db4f-46dd-b206-94e9f6d65e68/cinder-api-log/0.log" Feb 18 07:17:02 crc kubenswrapper[4707]: I0218 07:17:02.083239 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a75b087d-214f-4fed-a30c-d0d4f5607a08/probe/0.log" Feb 18 07:17:02 crc kubenswrapper[4707]: I0218 07:17:02.165537 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_2e24e699-659c-4701-9459-133197b510d7/cinder-scheduler/0.log" Feb 18 07:17:02 crc kubenswrapper[4707]: I0218 07:17:02.422618 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2e24e699-659c-4701-9459-133197b510d7/probe/0.log" Feb 18 07:17:02 crc kubenswrapper[4707]: I0218 07:17:02.597921 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a75b087d-214f-4fed-a30c-d0d4f5607a08/cinder-backup/0.log" Feb 18 07:17:03 crc kubenswrapper[4707]: I0218 07:17:03.042779 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_62efca2e-66ee-443d-910e-eb9c22f0536f/probe/0.log" Feb 18 07:17:03 crc kubenswrapper[4707]: I0218 07:17:03.119832 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-557tm_18d9274e-1766-4a10-9522-568030d5db64/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:03 crc kubenswrapper[4707]: I0218 07:17:03.427160 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-grlvs_beb9134a-dfca-4e8d-be56-0e0980d32bc8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:03 crc kubenswrapper[4707]: I0218 07:17:03.857640 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d474c7589-z56p2_0341405c-1d6a-4750-b7c5-07ae9825d4b6/init/0.log" Feb 18 07:17:03 crc kubenswrapper[4707]: I0218 07:17:03.946460 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_62efca2e-66ee-443d-910e-eb9c22f0536f/cinder-volume/0.log" Feb 18 07:17:04 crc kubenswrapper[4707]: I0218 07:17:04.042155 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d474c7589-z56p2_0341405c-1d6a-4750-b7c5-07ae9825d4b6/init/0.log" Feb 18 07:17:04 crc 
kubenswrapper[4707]: I0218 07:17:04.189607 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d474c7589-z56p2_0341405c-1d6a-4750-b7c5-07ae9825d4b6/dnsmasq-dns/0.log" Feb 18 07:17:04 crc kubenswrapper[4707]: I0218 07:17:04.766411 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xtr45_d7bc2edd-9db2-40df-be54-0db1c1b462fa/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:04 crc kubenswrapper[4707]: I0218 07:17:04.949052 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b/glance-httpd/0.log" Feb 18 07:17:04 crc kubenswrapper[4707]: I0218 07:17:04.986295 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e10650b3-4f5f-4a5f-a68f-9dcfffb83a2b/glance-log/0.log" Feb 18 07:17:05 crc kubenswrapper[4707]: I0218 07:17:05.029517 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_36b54f53-a798-4b8c-99ab-773ba732530b/glance-log/0.log" Feb 18 07:17:05 crc kubenswrapper[4707]: I0218 07:17:05.098069 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_36b54f53-a798-4b8c-99ab-773ba732530b/glance-httpd/0.log" Feb 18 07:17:05 crc kubenswrapper[4707]: I0218 07:17:05.336877 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77db99878b-h8xzs_6aa9efa8-e6b5-4307-89b1-8a67547a35e9/horizon/0.log" Feb 18 07:17:05 crc kubenswrapper[4707]: I0218 07:17:05.425540 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-rvws6_89235767-8eea-43b0-9b2e-cf7fc766a260/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:05 crc kubenswrapper[4707]: I0218 07:17:05.598108 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-45gf9_e3939063-5ede-47de-8c02-a46756c148b5/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:05 crc kubenswrapper[4707]: I0218 07:17:05.842537 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77db99878b-h8xzs_6aa9efa8-e6b5-4307-89b1-8a67547a35e9/horizon-log/0.log" Feb 18 07:17:05 crc kubenswrapper[4707]: I0218 07:17:05.933557 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29523301-k4v49_8fcf8387-c297-4bb6-acc4-810bb4fab9e5/keystone-cron/0.log" Feb 18 07:17:06 crc kubenswrapper[4707]: I0218 07:17:06.099346 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e2c446d7-5c5f-40e6-831d-4c3e6c75d13d/kube-state-metrics/0.log" Feb 18 07:17:06 crc kubenswrapper[4707]: I0218 07:17:06.218262 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cf7sv_603500de-24c1-4ef6-a13a-24646a085b58/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:06 crc kubenswrapper[4707]: I0218 07:17:06.707944 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_04bdc22d-7e6e-428b-849a-45c041654404/probe/0.log" Feb 18 07:17:07 crc kubenswrapper[4707]: I0218 07:17:07.179099 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_79366b7f-24dd-4217-b2af-7350751ce6d3/manila-api/0.log" Feb 18 07:17:07 crc kubenswrapper[4707]: I0218 07:17:07.188524 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_04bdc22d-7e6e-428b-849a-45c041654404/manila-scheduler/0.log" Feb 18 07:17:07 crc kubenswrapper[4707]: I0218 07:17:07.437027 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b/probe/0.log" Feb 18 07:17:07 crc kubenswrapper[4707]: I0218 
07:17:07.678064 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_79366b7f-24dd-4217-b2af-7350751ce6d3/manila-api-log/0.log" Feb 18 07:17:07 crc kubenswrapper[4707]: I0218 07:17:07.940424 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_1e5f9c3d-3ac5-470a-9fdb-ba7f6425d17b/manila-share/0.log" Feb 18 07:17:08 crc kubenswrapper[4707]: I0218 07:17:08.467148 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qzcjh_cf9b64ea-e740-4b80-b899-5f856afdd9c7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:08 crc kubenswrapper[4707]: I0218 07:17:08.834475 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6998659dbc-vmh65_0cb3d300-55b7-4aea-b732-2ab9a36ace83/neutron-httpd/0.log" Feb 18 07:17:09 crc kubenswrapper[4707]: I0218 07:17:09.912862 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6998659dbc-vmh65_0cb3d300-55b7-4aea-b732-2ab9a36ace83/neutron-api/0.log" Feb 18 07:17:11 crc kubenswrapper[4707]: I0218 07:17:11.478448 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6cc5d5b844-m7q6c_d9158ecc-f6e5-4c3f-a7e8-9195a34648b3/keystone-api/0.log" Feb 18 07:17:11 crc kubenswrapper[4707]: I0218 07:17:11.647177 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e596f7ea-65d0-41e8-8469-bf3aace5ed9a/nova-cell0-conductor-conductor/0.log" Feb 18 07:17:13 crc kubenswrapper[4707]: I0218 07:17:13.308633 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_95d9aeec-f182-49b1-9064-352e3bd2fe9b/nova-cell1-conductor-conductor/0.log" Feb 18 07:17:13 crc kubenswrapper[4707]: I0218 07:17:13.563477 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_ff7a9ac0-e9b0-4497-ad55-18768ff36da1/nova-api-log/0.log" Feb 18 07:17:13 crc kubenswrapper[4707]: I0218 07:17:13.680623 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_891ed851-3533-43e4-a60b-791e4ebd0afa/nova-cell1-novncproxy-novncproxy/0.log" Feb 18 07:17:14 crc kubenswrapper[4707]: I0218 07:17:14.358713 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-dck9h_4f8eff2f-2ca1-4fe4-8138-333c62468b97/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:14 crc kubenswrapper[4707]: I0218 07:17:14.491075 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ca2ba934-fce0-4bc1-af4e-27d758f7aef6/nova-metadata-log/0.log" Feb 18 07:17:15 crc kubenswrapper[4707]: I0218 07:17:15.104015 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_28c5a172-7c7d-407a-b727-0f982f82680c/mysql-bootstrap/0.log" Feb 18 07:17:15 crc kubenswrapper[4707]: I0218 07:17:15.197186 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_cda1514b-6a18-4c59-8d92-4168f4dc589f/nova-scheduler-scheduler/0.log" Feb 18 07:17:15 crc kubenswrapper[4707]: I0218 07:17:15.334568 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ff7a9ac0-e9b0-4497-ad55-18768ff36da1/nova-api-api/0.log" Feb 18 07:17:15 crc kubenswrapper[4707]: I0218 07:17:15.380147 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_28c5a172-7c7d-407a-b727-0f982f82680c/mysql-bootstrap/0.log" Feb 18 07:17:15 crc kubenswrapper[4707]: I0218 07:17:15.398081 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_28c5a172-7c7d-407a-b727-0f982f82680c/galera/0.log" Feb 18 07:17:16 crc kubenswrapper[4707]: I0218 07:17:16.127325 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ee6297b-9af9-40fd-90e0-edcb0c08f6e8/mysql-bootstrap/0.log" Feb 18 07:17:16 crc kubenswrapper[4707]: I0218 07:17:16.389730 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ee6297b-9af9-40fd-90e0-edcb0c08f6e8/galera/0.log" Feb 18 07:17:16 crc kubenswrapper[4707]: I0218 07:17:16.397113 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ee6297b-9af9-40fd-90e0-edcb0c08f6e8/mysql-bootstrap/0.log" Feb 18 07:17:16 crc kubenswrapper[4707]: I0218 07:17:16.607702 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_73dac699-5199-47dc-b173-8df7813c1ad4/openstackclient/0.log" Feb 18 07:17:16 crc kubenswrapper[4707]: I0218 07:17:16.634096 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-297xq_ea5baf83-32e6-41ec-b14a-d32b3f848be6/ovn-controller/0.log" Feb 18 07:17:16 crc kubenswrapper[4707]: I0218 07:17:16.685498 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ca2ba934-fce0-4bc1-af4e-27d758f7aef6/nova-metadata-metadata/0.log" Feb 18 07:17:16 crc kubenswrapper[4707]: I0218 07:17:16.911305 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-m2jp7_866d1055-f899-4a65-a353-366bf3a303bf/openstack-network-exporter/0.log" Feb 18 07:17:16 crc kubenswrapper[4707]: I0218 07:17:16.978225 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f95ql_45dd27c5-0315-416d-99cc-197009aa5a8f/ovsdb-server-init/0.log" Feb 18 07:17:17 crc kubenswrapper[4707]: I0218 07:17:17.195505 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f95ql_45dd27c5-0315-416d-99cc-197009aa5a8f/ovsdb-server-init/0.log" Feb 18 07:17:17 crc kubenswrapper[4707]: I0218 07:17:17.201182 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-f95ql_45dd27c5-0315-416d-99cc-197009aa5a8f/ovs-vswitchd/0.log" Feb 18 07:17:17 crc kubenswrapper[4707]: I0218 07:17:17.213407 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f95ql_45dd27c5-0315-416d-99cc-197009aa5a8f/ovsdb-server/0.log" Feb 18 07:17:17 crc kubenswrapper[4707]: I0218 07:17:17.431389 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xvlwh_09e0f47e-9057-4b18-ba9a-41b34b4fe425/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:17 crc kubenswrapper[4707]: I0218 07:17:17.507818 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_250e525d-abb2-4374-89d4-3b16602fc351/openstack-network-exporter/0.log" Feb 18 07:17:17 crc kubenswrapper[4707]: I0218 07:17:17.515531 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_250e525d-abb2-4374-89d4-3b16602fc351/ovn-northd/0.log" Feb 18 07:17:17 crc kubenswrapper[4707]: I0218 07:17:17.686938 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_27a6797f-d647-4727-ac53-df0b6d7495ca/ovsdbserver-nb/0.log" Feb 18 07:17:17 crc kubenswrapper[4707]: I0218 07:17:17.769067 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_27a6797f-d647-4727-ac53-df0b6d7495ca/openstack-network-exporter/0.log" Feb 18 07:17:18 crc kubenswrapper[4707]: I0218 07:17:18.087727 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c2df84df-113d-42d3-b7e7-ee6d01888dd9/ovsdbserver-sb/0.log" Feb 18 07:17:18 crc kubenswrapper[4707]: I0218 07:17:18.125907 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c2df84df-113d-42d3-b7e7-ee6d01888dd9/openstack-network-exporter/0.log" Feb 18 07:17:18 crc kubenswrapper[4707]: I0218 07:17:18.448665 4707 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d/setup-container/0.log" Feb 18 07:17:18 crc kubenswrapper[4707]: I0218 07:17:18.623040 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-986f6fbf8-z89c7_9d4c03f6-2a0f-460b-9b68-50838289b469/placement-api/0.log" Feb 18 07:17:18 crc kubenswrapper[4707]: I0218 07:17:18.734399 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d/setup-container/0.log" Feb 18 07:17:18 crc kubenswrapper[4707]: I0218 07:17:18.772597 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7ba5a1aa-0d87-48c8-8258-2eff19d8ae1d/rabbitmq/0.log" Feb 18 07:17:18 crc kubenswrapper[4707]: I0218 07:17:18.930929 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-986f6fbf8-z89c7_9d4c03f6-2a0f-460b-9b68-50838289b469/placement-log/0.log" Feb 18 07:17:19 crc kubenswrapper[4707]: I0218 07:17:19.024880 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7b14ae66-3d41-476b-9ca7-2490e36de0aa/setup-container/0.log" Feb 18 07:17:19 crc kubenswrapper[4707]: I0218 07:17:19.148658 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7b14ae66-3d41-476b-9ca7-2490e36de0aa/setup-container/0.log" Feb 18 07:17:19 crc kubenswrapper[4707]: I0218 07:17:19.225656 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7b14ae66-3d41-476b-9ca7-2490e36de0aa/rabbitmq/0.log" Feb 18 07:17:19 crc kubenswrapper[4707]: I0218 07:17:19.232488 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-f66x8_e81fc37d-6fb1-4a43-b632-cec42f602002/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:19 crc kubenswrapper[4707]: I0218 07:17:19.462312 4707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lnn9t_4ccb192f-200d-453b-8829-3cdaddb0987b/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:19 crc kubenswrapper[4707]: I0218 07:17:19.502464 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4hc8q_668c00e7-edea-47b0-a904-961fb756cb1d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:19 crc kubenswrapper[4707]: I0218 07:17:19.716313 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-w7kvz_72503a5f-0b97-4eee-b0d1-7f9621b6917c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:19 crc kubenswrapper[4707]: I0218 07:17:19.938404 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5rj9p_eda30b1a-96f0-425e-908d-4846ffe8c3bb/ssh-known-hosts-edpm-deployment/0.log" Feb 18 07:17:20 crc kubenswrapper[4707]: I0218 07:17:20.160845 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-878756b99-xx5vn_a6b4c749-b753-42b9-8bc7-fb25121f0ea8/proxy-server/0.log" Feb 18 07:17:20 crc kubenswrapper[4707]: I0218 07:17:20.225160 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-878756b99-xx5vn_a6b4c749-b753-42b9-8bc7-fb25121f0ea8/proxy-httpd/0.log" Feb 18 07:17:20 crc kubenswrapper[4707]: I0218 07:17:20.512859 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-rbttv_4ff550e2-53ae-4f38-98d1-e95da8f7bde6/swift-ring-rebalance/0.log" Feb 18 07:17:20 crc kubenswrapper[4707]: I0218 07:17:20.666306 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/account-reaper/0.log" Feb 18 07:17:20 crc kubenswrapper[4707]: I0218 07:17:20.723633 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/account-auditor/0.log" Feb 18 07:17:20 crc kubenswrapper[4707]: I0218 07:17:20.747047 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/account-replicator/0.log" Feb 18 07:17:20 crc kubenswrapper[4707]: I0218 07:17:20.876486 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/account-server/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.382212 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.382253 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.433035 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/container-auditor/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.433514 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/container-server/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.481559 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/container-replicator/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.576756 4707 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/container-updater/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.667769 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-expirer/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.705458 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-auditor/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.755374 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-replicator/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.897127 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-server/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.950601 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/object-updater/0.log" Feb 18 07:17:21 crc kubenswrapper[4707]: I0218 07:17:21.954340 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/rsync/0.log" Feb 18 07:17:22 crc kubenswrapper[4707]: I0218 07:17:22.019571 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5253fac6-1dd5-48c7-853a-f7cfa41840fa/swift-recon-cron/0.log" Feb 18 07:17:22 crc kubenswrapper[4707]: I0218 07:17:22.330385 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bbq8l_6afe228a-638b-41a3-ba74-556fbc740148/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:22 crc kubenswrapper[4707]: I0218 07:17:22.441188 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e369f41f-534e-48ee-bdcb-da26b742cfc3/tempest-tests-tempest-tests-runner/0.log" Feb 18 07:17:22 crc kubenswrapper[4707]: I0218 07:17:22.462409 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3c04cb22-866a-403f-9a07-1a12cfd909e2/test-operator-logs-container/0.log" Feb 18 07:17:22 crc kubenswrapper[4707]: I0218 07:17:22.682892 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-wvhzl_6cfdc829-6a01-4b1b-b774-5b7a0ff96d68/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 07:17:32 crc kubenswrapper[4707]: I0218 07:17:32.277358 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4a3a1b52-c364-480e-a60b-8bc313f3002d/memcached/0.log" Feb 18 07:17:48 crc kubenswrapper[4707]: I0218 07:17:48.352574 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/util/0.log" Feb 18 07:17:48 crc kubenswrapper[4707]: I0218 07:17:48.510524 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/pull/0.log" Feb 18 07:17:48 crc kubenswrapper[4707]: I0218 07:17:48.564845 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/util/0.log" Feb 18 07:17:48 crc kubenswrapper[4707]: I0218 07:17:48.582920 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/pull/0.log" Feb 18 07:17:48 crc kubenswrapper[4707]: I0218 
07:17:48.868268 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/util/0.log" Feb 18 07:17:48 crc kubenswrapper[4707]: I0218 07:17:48.876872 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/pull/0.log" Feb 18 07:17:48 crc kubenswrapper[4707]: I0218 07:17:48.886582 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c7a89cc7e0c1448af9bdc6569e350a100e1c9490ebb439c83fb771103fmrx76_4a393b6f-ea10-4977-827a-be170d705fff/extract/0.log" Feb 18 07:17:49 crc kubenswrapper[4707]: I0218 07:17:49.288421 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-6lrh7_bc6f5234-aab6-43ea-89e1-a3f785742a89/manager/0.log" Feb 18 07:17:49 crc kubenswrapper[4707]: I0218 07:17:49.633323 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-fnz67_8f61ada5-7374-4801-89b2-c95aec2e52ab/manager/0.log" Feb 18 07:17:49 crc kubenswrapper[4707]: I0218 07:17:49.800486 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-244nk_6d9f6300-cce0-4cb2-8f7d-eb3fd22f5742/manager/0.log" Feb 18 07:17:50 crc kubenswrapper[4707]: I0218 07:17:50.016286 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-85hj5_1a236879-9c6a-4604-b5bc-024b7dfd5161/manager/0.log" Feb 18 07:17:50 crc kubenswrapper[4707]: I0218 07:17:50.524891 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-j5dft_93d80e73-44d0-4db8-8a43-ee2cc8b7e399/manager/0.log" Feb 18 
07:17:50 crc kubenswrapper[4707]: I0218 07:17:50.703668 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-8gngt_8ed2f5cf-84b8-4a09-b76f-a60bcb055a04/manager/0.log" Feb 18 07:17:51 crc kubenswrapper[4707]: I0218 07:17:51.070855 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-6f96w_274d7d14-4ef9-47b8-8a2e-07e7a2bb9850/manager/0.log" Feb 18 07:17:51 crc kubenswrapper[4707]: I0218 07:17:51.252206 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-c5s9h_dc8762c9-27f5-476e-840f-815aa3736e85/manager/0.log" Feb 18 07:17:51 crc kubenswrapper[4707]: I0218 07:17:51.381727 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:17:51 crc kubenswrapper[4707]: I0218 07:17:51.381804 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:17:51 crc kubenswrapper[4707]: I0218 07:17:51.473584 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-v6v5m_4c759f5c-da54-44e9-8dec-5f2622419af9/manager/0.log" Feb 18 07:17:51 crc kubenswrapper[4707]: I0218 07:17:51.561872 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-rnhbn_4fcd0bf8-cf6a-45c0-862b-5554daa34c21/manager/0.log" Feb 
18 07:17:51 crc kubenswrapper[4707]: I0218 07:17:51.720880 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-2hv82_5442f037-ff83-40b8-9c3f-c73c227effde/manager/0.log" Feb 18 07:17:51 crc kubenswrapper[4707]: I0218 07:17:51.891449 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-twhwz_dc38a034-90cc-4976-93dd-ae54d298b574/manager/0.log" Feb 18 07:17:52 crc kubenswrapper[4707]: I0218 07:17:52.165550 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c86ffd_8078f629-a80e-4f59-b84a-33144cc5b0c6/manager/0.log" Feb 18 07:17:52 crc kubenswrapper[4707]: I0218 07:17:52.567361 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-766dc4fc6-q9dtp_b70a612f-7e0b-4187-82b0-404c913ce3d4/operator/0.log" Feb 18 07:17:52 crc kubenswrapper[4707]: I0218 07:17:52.800394 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nnvqj_ccede39b-e3bf-4e86-9b9e-bbdc1b13a349/registry-server/0.log" Feb 18 07:17:53 crc kubenswrapper[4707]: I0218 07:17:53.018386 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-9bb9z_1332f158-2c06-4a3d-9ca9-2dc667c471ba/manager/0.log" Feb 18 07:17:53 crc kubenswrapper[4707]: I0218 07:17:53.276682 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-48k7j_398bbd80-3377-4b8e-b9cd-bdb3a76167ca/manager/0.log" Feb 18 07:17:53 crc kubenswrapper[4707]: I0218 07:17:53.519972 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4qmcg_97e7c996-241f-4732-9e68-a371d114f664/operator/0.log" 
Feb 18 07:17:53 crc kubenswrapper[4707]: I0218 07:17:53.758255 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-tj2tj_78a96912-db1a-42b8-80aa-7800f28fb0c2/manager/0.log" Feb 18 07:17:54 crc kubenswrapper[4707]: I0218 07:17:54.069222 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-ln7bk_7f2692c0-280b-4449-ac2d-6a9da6eafebe/manager/0.log" Feb 18 07:17:54 crc kubenswrapper[4707]: I0218 07:17:54.231207 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-2dp4d_d4364aee-09c0-49d9-8f50-60e48ecb7d08/manager/0.log" Feb 18 07:17:54 crc kubenswrapper[4707]: I0218 07:17:54.453429 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-xbl8j_77cee8d8-c1d5-4743-a6c0-478b7c16e991/manager/0.log" Feb 18 07:17:54 crc kubenswrapper[4707]: I0218 07:17:54.586046 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-dccc9b448-chjxf_c8f9f5f4-3cdb-4b04-bc52-26acb4dda227/manager/0.log" Feb 18 07:17:54 crc kubenswrapper[4707]: I0218 07:17:54.880863 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-4d59f_890576c4-79c6-40dc-b786-0fb2055a1a3e/manager/0.log" Feb 18 07:18:00 crc kubenswrapper[4707]: I0218 07:18:00.064786 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-qj27r_67bdd3cc-ee7d-4e79-8568-75502788aa1d/manager/0.log" Feb 18 07:18:14 crc kubenswrapper[4707]: I0218 07:18:14.025435 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tnwns_e09b1e8f-752e-42dc-a638-cc7ac7179f83/control-plane-machine-set-operator/0.log" Feb 18 07:18:14 crc kubenswrapper[4707]: I0218 07:18:14.277455 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-g4hvr_b7a4eced-46b2-4002-964d-490b0ad2acd3/kube-rbac-proxy/0.log" Feb 18 07:18:14 crc kubenswrapper[4707]: I0218 07:18:14.320771 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-g4hvr_b7a4eced-46b2-4002-964d-490b0ad2acd3/machine-api-operator/0.log" Feb 18 07:18:21 crc kubenswrapper[4707]: I0218 07:18:21.382673 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:18:21 crc kubenswrapper[4707]: I0218 07:18:21.384373 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:18:21 crc kubenswrapper[4707]: I0218 07:18:21.384499 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 07:18:21 crc kubenswrapper[4707]: I0218 07:18:21.385306 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"285a82f9ab065dc06e8aad4a395d2035da5abd7059a2b1dc1270dfccd3ef7a66"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 18 07:18:21 crc kubenswrapper[4707]: I0218 07:18:21.385499 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://285a82f9ab065dc06e8aad4a395d2035da5abd7059a2b1dc1270dfccd3ef7a66" gracePeriod=600 Feb 18 07:18:21 crc kubenswrapper[4707]: I0218 07:18:21.975954 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="285a82f9ab065dc06e8aad4a395d2035da5abd7059a2b1dc1270dfccd3ef7a66" exitCode=0 Feb 18 07:18:21 crc kubenswrapper[4707]: I0218 07:18:21.976035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"285a82f9ab065dc06e8aad4a395d2035da5abd7059a2b1dc1270dfccd3ef7a66"} Feb 18 07:18:21 crc kubenswrapper[4707]: I0218 07:18:21.976583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88"} Feb 18 07:18:21 crc kubenswrapper[4707]: I0218 07:18:21.976607 4707 scope.go:117] "RemoveContainer" containerID="b4272004f371b222ad9f76a67629ce85475c8c1fa3a14b290bcc6643103b2af8" Feb 18 07:18:25 crc kubenswrapper[4707]: I0218 07:18:25.933310 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fvczt_ca9b8372-6acf-4c51-8eaf-a0f0195ed4e0/cert-manager-controller/0.log" Feb 18 07:18:26 crc kubenswrapper[4707]: I0218 07:18:26.142462 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-hr4ts_e313d623-e3cf-4c4a-a0d5-aeaf4db44a3f/cert-manager-cainjector/0.log" Feb 18 07:18:26 crc kubenswrapper[4707]: I0218 07:18:26.208246 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-dgt7d_07e55359-ea37-4736-a571-315823908633/cert-manager-webhook/0.log" Feb 18 07:18:37 crc kubenswrapper[4707]: I0218 07:18:37.852744 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-8xc9q_fd51f090-835b-4d6f-9204-1564b2430039/nmstate-console-plugin/0.log" Feb 18 07:18:38 crc kubenswrapper[4707]: I0218 07:18:38.021588 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dfp8q_7c0f897b-dd2e-4356-ae9d-a85bae401266/nmstate-handler/0.log" Feb 18 07:18:38 crc kubenswrapper[4707]: I0218 07:18:38.105688 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qmfqj_07ea2b2b-697c-491e-89fe-707d7a2f6a32/kube-rbac-proxy/0.log" Feb 18 07:18:38 crc kubenswrapper[4707]: I0218 07:18:38.136935 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qmfqj_07ea2b2b-697c-491e-89fe-707d7a2f6a32/nmstate-metrics/0.log" Feb 18 07:18:38 crc kubenswrapper[4707]: I0218 07:18:38.246687 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-wjprs_69e29cb2-d836-4c53-81e6-1d387d6202b9/nmstate-operator/0.log" Feb 18 07:18:38 crc kubenswrapper[4707]: I0218 07:18:38.354574 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-cn2n8_ff6bca46-e4fa-443f-82c7-7995a2b6499b/nmstate-webhook/0.log" Feb 18 07:19:03 crc kubenswrapper[4707]: I0218 07:19:03.512302 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-69bbfbf88f-4gf26_9027d820-8aca-4e3b-84f0-4b81be566548/kube-rbac-proxy/0.log" Feb 18 07:19:03 crc kubenswrapper[4707]: I0218 07:19:03.714160 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-frr-files/0.log" Feb 18 07:19:03 crc kubenswrapper[4707]: I0218 07:19:03.716686 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-4gf26_9027d820-8aca-4e3b-84f0-4b81be566548/controller/0.log" Feb 18 07:19:03 crc kubenswrapper[4707]: I0218 07:19:03.978227 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-reloader/0.log" Feb 18 07:19:03 crc kubenswrapper[4707]: I0218 07:19:03.993446 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-reloader/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.000627 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-metrics/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.059842 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-frr-files/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.177024 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-frr-files/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.209178 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-reloader/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.260741 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-metrics/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.286835 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-metrics/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.495461 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-metrics/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.527023 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-frr-files/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.532020 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/cp-reloader/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.559769 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/controller/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.726771 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/kube-rbac-proxy/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.727599 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/frr-metrics/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.788268 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/kube-rbac-proxy-frr/0.log" Feb 18 07:19:04 crc kubenswrapper[4707]: I0218 07:19:04.951301 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/reloader/0.log" Feb 18 07:19:05 crc kubenswrapper[4707]: I0218 07:19:05.060635 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-kt4g9_a3bf8bf8-cafc-49e2-b284-33d016f8bb50/frr-k8s-webhook-server/0.log" Feb 18 07:19:05 crc kubenswrapper[4707]: I0218 07:19:05.226240 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7cd6fc9664-wtj2x_a755d713-37ff-463f-81d6-aa0bfc05c654/manager/0.log" Feb 18 07:19:05 crc kubenswrapper[4707]: I0218 07:19:05.423287 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79944b854-l7jrs_44e90078-03e0-4691-ba27-cbd9c5ab9cbe/webhook-server/0.log" Feb 18 07:19:05 crc kubenswrapper[4707]: I0218 07:19:05.582310 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lh4s7_c3112677-66f0-45d3-9281-094cd5c11163/kube-rbac-proxy/0.log" Feb 18 07:19:06 crc kubenswrapper[4707]: I0218 07:19:06.226851 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lh4s7_c3112677-66f0-45d3-9281-094cd5c11163/speaker/0.log" Feb 18 07:19:06 crc kubenswrapper[4707]: I0218 07:19:06.518651 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5z4hx_7260955d-03d4-4758-8159-b7c648865b62/frr/0.log" Feb 18 07:19:18 crc kubenswrapper[4707]: I0218 07:19:18.416814 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/util/0.log" Feb 18 07:19:18 crc kubenswrapper[4707]: I0218 07:19:18.594119 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/pull/0.log" 
Feb 18 07:19:18 crc kubenswrapper[4707]: I0218 07:19:18.621734 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/util/0.log" Feb 18 07:19:18 crc kubenswrapper[4707]: I0218 07:19:18.659121 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/pull/0.log" Feb 18 07:19:18 crc kubenswrapper[4707]: I0218 07:19:18.834839 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/pull/0.log" Feb 18 07:19:18 crc kubenswrapper[4707]: I0218 07:19:18.840486 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/extract/0.log" Feb 18 07:19:18 crc kubenswrapper[4707]: I0218 07:19:18.859598 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213szl85_cc452865-bae4-4e28-ac49-ecc5bdd1a5c2/util/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.032468 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-utilities/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.174222 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-content/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.197422 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-content/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.209620 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-utilities/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.320373 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-utilities/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.360488 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/extract-content/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.571892 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-utilities/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.858092 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-content/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.866946 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-content/0.log" Feb 18 07:19:19 crc kubenswrapper[4707]: I0218 07:19:19.902921 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-utilities/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.128985 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rsz6w_2c3ae920-45c5-4b49-aed2-d651c4de9499/registry-server/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.140042 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-utilities/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.183048 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/extract-content/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.353423 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/util/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.538449 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-56pwk_abf2b1e9-91f8-41a4-95b4-a14e4af58f6f/registry-server/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.600241 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/util/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.634629 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/pull/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.637356 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/pull/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.792254 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/util/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.808848 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/extract/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.838681 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatvf7n_250c6074-0914-405f-ac9f-59f2d01c6cf1/pull/0.log" Feb 18 07:19:20 crc kubenswrapper[4707]: I0218 07:19:20.975607 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z7xmc_ef5bee5f-c0c3-471e-88fb-43735b7c0b31/marketplace-operator/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.001566 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-utilities/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.182911 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-content/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.183428 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-content/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.238155 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-utilities/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.353946 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-utilities/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.382968 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/extract-content/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.545009 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kptw6_c918c8de-d428-484d-910e-c513ed5db3b9/registry-server/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.631028 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-utilities/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.853108 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-content/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.856432 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-content/0.log" Feb 18 07:19:21 crc kubenswrapper[4707]: I0218 07:19:21.857451 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-utilities/0.log" Feb 18 07:19:22 crc kubenswrapper[4707]: I0218 07:19:22.026440 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-utilities/0.log" Feb 18 07:19:22 crc kubenswrapper[4707]: I0218 07:19:22.094779 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/extract-content/0.log" Feb 
18 07:19:22 crc kubenswrapper[4707]: I0218 07:19:22.827096 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zprxl_9a253b57-9570-4ed9-8d7a-acb1733f9db2/registry-server/0.log" Feb 18 07:19:56 crc kubenswrapper[4707]: E0218 07:19:56.627994 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.17:47640->38.102.83.17:43371: write tcp 38.102.83.17:47640->38.102.83.17:43371: write: broken pipe Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.382223 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.382754 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.569318 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k8qlb"] Feb 18 07:20:21 crc kubenswrapper[4707]: E0218 07:20:21.569792 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec745b7b-f620-45c6-9b0b-613c7d299f21" containerName="container-00" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.569830 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec745b7b-f620-45c6-9b0b-613c7d299f21" containerName="container-00" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.570066 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec745b7b-f620-45c6-9b0b-613c7d299f21" containerName="container-00" Feb 18 
07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.571822 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.581959 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8qlb"] Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.706771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-catalog-content\") pod \"community-operators-k8qlb\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.706837 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-utilities\") pod \"community-operators-k8qlb\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.707276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4jvf\" (UniqueName: \"kubernetes.io/projected/6d5cf56a-6879-465b-99e8-50d1c149b937-kube-api-access-t4jvf\") pod \"community-operators-k8qlb\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.809073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-catalog-content\") pod \"community-operators-k8qlb\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " 
pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.809126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-utilities\") pod \"community-operators-k8qlb\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.809285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4jvf\" (UniqueName: \"kubernetes.io/projected/6d5cf56a-6879-465b-99e8-50d1c149b937-kube-api-access-t4jvf\") pod \"community-operators-k8qlb\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.809599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-catalog-content\") pod \"community-operators-k8qlb\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.809668 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-utilities\") pod \"community-operators-k8qlb\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.827706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4jvf\" (UniqueName: \"kubernetes.io/projected/6d5cf56a-6879-465b-99e8-50d1c149b937-kube-api-access-t4jvf\") pod \"community-operators-k8qlb\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " 
pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:21 crc kubenswrapper[4707]: I0218 07:20:21.903912 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:22 crc kubenswrapper[4707]: I0218 07:20:22.431453 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8qlb"] Feb 18 07:20:23 crc kubenswrapper[4707]: I0218 07:20:23.055484 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerID="ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45" exitCode=0 Feb 18 07:20:23 crc kubenswrapper[4707]: I0218 07:20:23.056001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8qlb" event={"ID":"6d5cf56a-6879-465b-99e8-50d1c149b937","Type":"ContainerDied","Data":"ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45"} Feb 18 07:20:23 crc kubenswrapper[4707]: I0218 07:20:23.056076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8qlb" event={"ID":"6d5cf56a-6879-465b-99e8-50d1c149b937","Type":"ContainerStarted","Data":"f5979c1522bf632571855cadfb33846db14d40fb6a2a973239bb83f326b7534f"} Feb 18 07:20:23 crc kubenswrapper[4707]: I0218 07:20:23.058970 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 07:20:24 crc kubenswrapper[4707]: I0218 07:20:24.070529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8qlb" event={"ID":"6d5cf56a-6879-465b-99e8-50d1c149b937","Type":"ContainerStarted","Data":"c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f"} Feb 18 07:20:25 crc kubenswrapper[4707]: I0218 07:20:25.082381 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d5cf56a-6879-465b-99e8-50d1c149b937" 
containerID="c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f" exitCode=0 Feb 18 07:20:25 crc kubenswrapper[4707]: I0218 07:20:25.082452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8qlb" event={"ID":"6d5cf56a-6879-465b-99e8-50d1c149b937","Type":"ContainerDied","Data":"c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f"} Feb 18 07:20:26 crc kubenswrapper[4707]: I0218 07:20:26.093202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8qlb" event={"ID":"6d5cf56a-6879-465b-99e8-50d1c149b937","Type":"ContainerStarted","Data":"c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17"} Feb 18 07:20:26 crc kubenswrapper[4707]: I0218 07:20:26.116875 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k8qlb" podStartSLOduration=2.380298702 podStartE2EDuration="5.116852919s" podCreationTimestamp="2026-02-18 07:20:21 +0000 UTC" firstStartedPulling="2026-02-18 07:20:23.058663318 +0000 UTC m=+5559.706622452" lastFinishedPulling="2026-02-18 07:20:25.795217535 +0000 UTC m=+5562.443176669" observedRunningTime="2026-02-18 07:20:26.108235105 +0000 UTC m=+5562.756194239" watchObservedRunningTime="2026-02-18 07:20:26.116852919 +0000 UTC m=+5562.764812053" Feb 18 07:20:31 crc kubenswrapper[4707]: I0218 07:20:31.904901 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:31 crc kubenswrapper[4707]: I0218 07:20:31.906464 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:31 crc kubenswrapper[4707]: I0218 07:20:31.956348 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:32 crc kubenswrapper[4707]: I0218 
07:20:32.200413 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:32 crc kubenswrapper[4707]: I0218 07:20:32.259585 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8qlb"] Feb 18 07:20:34 crc kubenswrapper[4707]: I0218 07:20:34.164185 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k8qlb" podUID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerName="registry-server" containerID="cri-o://c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17" gracePeriod=2 Feb 18 07:20:34 crc kubenswrapper[4707]: I0218 07:20:34.642173 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:34 crc kubenswrapper[4707]: I0218 07:20:34.687887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-catalog-content\") pod \"6d5cf56a-6879-465b-99e8-50d1c149b937\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " Feb 18 07:20:34 crc kubenswrapper[4707]: I0218 07:20:34.688266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4jvf\" (UniqueName: \"kubernetes.io/projected/6d5cf56a-6879-465b-99e8-50d1c149b937-kube-api-access-t4jvf\") pod \"6d5cf56a-6879-465b-99e8-50d1c149b937\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " Feb 18 07:20:34 crc kubenswrapper[4707]: I0218 07:20:34.688389 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-utilities\") pod \"6d5cf56a-6879-465b-99e8-50d1c149b937\" (UID: \"6d5cf56a-6879-465b-99e8-50d1c149b937\") " Feb 18 07:20:34 crc kubenswrapper[4707]: 
I0218 07:20:34.689120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-utilities" (OuterVolumeSpecName: "utilities") pod "6d5cf56a-6879-465b-99e8-50d1c149b937" (UID: "6d5cf56a-6879-465b-99e8-50d1c149b937"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:20:34 crc kubenswrapper[4707]: I0218 07:20:34.694537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5cf56a-6879-465b-99e8-50d1c149b937-kube-api-access-t4jvf" (OuterVolumeSpecName: "kube-api-access-t4jvf") pod "6d5cf56a-6879-465b-99e8-50d1c149b937" (UID: "6d5cf56a-6879-465b-99e8-50d1c149b937"). InnerVolumeSpecName "kube-api-access-t4jvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:20:34 crc kubenswrapper[4707]: I0218 07:20:34.790916 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 07:20:34 crc kubenswrapper[4707]: I0218 07:20:34.790974 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4jvf\" (UniqueName: \"kubernetes.io/projected/6d5cf56a-6879-465b-99e8-50d1c149b937-kube-api-access-t4jvf\") on node \"crc\" DevicePath \"\"" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.174569 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerID="c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17" exitCode=0 Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.174631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8qlb" event={"ID":"6d5cf56a-6879-465b-99e8-50d1c149b937","Type":"ContainerDied","Data":"c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17"} Feb 18 07:20:35 crc 
kubenswrapper[4707]: I0218 07:20:35.174639 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8qlb" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.174675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8qlb" event={"ID":"6d5cf56a-6879-465b-99e8-50d1c149b937","Type":"ContainerDied","Data":"f5979c1522bf632571855cadfb33846db14d40fb6a2a973239bb83f326b7534f"} Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.174701 4707 scope.go:117] "RemoveContainer" containerID="c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.183304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d5cf56a-6879-465b-99e8-50d1c149b937" (UID: "6d5cf56a-6879-465b-99e8-50d1c149b937"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.194466 4707 scope.go:117] "RemoveContainer" containerID="c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.199718 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf56a-6879-465b-99e8-50d1c149b937-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.221226 4707 scope.go:117] "RemoveContainer" containerID="ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.258779 4707 scope.go:117] "RemoveContainer" containerID="c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17" Feb 18 07:20:35 crc kubenswrapper[4707]: E0218 07:20:35.259343 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17\": container with ID starting with c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17 not found: ID does not exist" containerID="c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.259379 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17"} err="failed to get container status \"c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17\": rpc error: code = NotFound desc = could not find container \"c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17\": container with ID starting with c5d1739db392d9c0521efbf847c6a7327edfbac4159a352243cd54ba3a548e17 not found: ID does not exist" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.259405 4707 
scope.go:117] "RemoveContainer" containerID="c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f" Feb 18 07:20:35 crc kubenswrapper[4707]: E0218 07:20:35.259779 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f\": container with ID starting with c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f not found: ID does not exist" containerID="c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.259831 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f"} err="failed to get container status \"c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f\": rpc error: code = NotFound desc = could not find container \"c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f\": container with ID starting with c5c1ba853437bc5e63ecaa4262da847f9b4edd2a04f623076532cbb9e3923f9f not found: ID does not exist" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.259857 4707 scope.go:117] "RemoveContainer" containerID="ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45" Feb 18 07:20:35 crc kubenswrapper[4707]: E0218 07:20:35.260138 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45\": container with ID starting with ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45 not found: ID does not exist" containerID="ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.260171 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45"} err="failed to get container status \"ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45\": rpc error: code = NotFound desc = could not find container \"ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45\": container with ID starting with ab0afdfd6980fe5bae007c42b4cc7679a563ed7576568b32b8b86e1613ca9d45 not found: ID does not exist" Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.511826 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8qlb"] Feb 18 07:20:35 crc kubenswrapper[4707]: I0218 07:20:35.527338 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k8qlb"] Feb 18 07:20:36 crc kubenswrapper[4707]: I0218 07:20:36.070346 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d5cf56a-6879-465b-99e8-50d1c149b937" path="/var/lib/kubelet/pods/6d5cf56a-6879-465b-99e8-50d1c149b937/volumes" Feb 18 07:20:51 crc kubenswrapper[4707]: I0218 07:20:51.382370 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:20:51 crc kubenswrapper[4707]: I0218 07:20:51.382909 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.841838 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mrk9s"] Feb 18 07:21:16 crc 
kubenswrapper[4707]: E0218 07:21:16.843372 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerName="extract-content" Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.843393 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerName="extract-content" Feb 18 07:21:16 crc kubenswrapper[4707]: E0218 07:21:16.843428 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerName="extract-utilities" Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.843435 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerName="extract-utilities" Feb 18 07:21:16 crc kubenswrapper[4707]: E0218 07:21:16.843447 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerName="registry-server" Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.843454 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerName="registry-server" Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.843704 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d5cf56a-6879-465b-99e8-50d1c149b937" containerName="registry-server" Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.845650 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.857018 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrk9s"] Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.967166 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-utilities\") pod \"redhat-operators-mrk9s\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.967493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-catalog-content\") pod \"redhat-operators-mrk9s\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:16 crc kubenswrapper[4707]: I0218 07:21:16.967543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd2sb\" (UniqueName: \"kubernetes.io/projected/79d6a129-35a0-49e3-9711-dc6e9ca19189-kube-api-access-nd2sb\") pod \"redhat-operators-mrk9s\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.039904 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5s9qh"] Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.044838 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.049134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s9qh"] Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.069271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-utilities\") pod \"redhat-operators-mrk9s\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.069314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-catalog-content\") pod \"redhat-operators-mrk9s\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.069346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd2sb\" (UniqueName: \"kubernetes.io/projected/79d6a129-35a0-49e3-9711-dc6e9ca19189-kube-api-access-nd2sb\") pod \"redhat-operators-mrk9s\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.069721 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-utilities\") pod \"redhat-operators-mrk9s\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.069826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-catalog-content\") pod \"redhat-operators-mrk9s\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.126654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd2sb\" (UniqueName: \"kubernetes.io/projected/79d6a129-35a0-49e3-9711-dc6e9ca19189-kube-api-access-nd2sb\") pod \"redhat-operators-mrk9s\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.171891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw88l\" (UniqueName: \"kubernetes.io/projected/b831ec46-4768-4177-b077-8fbf33d23a85-kube-api-access-vw88l\") pod \"redhat-marketplace-5s9qh\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.172241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-catalog-content\") pod \"redhat-marketplace-5s9qh\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.173023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-utilities\") pod \"redhat-marketplace-5s9qh\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.175904 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.274915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw88l\" (UniqueName: \"kubernetes.io/projected/b831ec46-4768-4177-b077-8fbf33d23a85-kube-api-access-vw88l\") pod \"redhat-marketplace-5s9qh\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.275011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-catalog-content\") pod \"redhat-marketplace-5s9qh\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.275108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-utilities\") pod \"redhat-marketplace-5s9qh\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.275688 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-utilities\") pod \"redhat-marketplace-5s9qh\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.276207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-catalog-content\") pod \"redhat-marketplace-5s9qh\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " 
pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.294585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw88l\" (UniqueName: \"kubernetes.io/projected/b831ec46-4768-4177-b077-8fbf33d23a85-kube-api-access-vw88l\") pod \"redhat-marketplace-5s9qh\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.368957 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.785132 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrk9s"] Feb 18 07:21:17 crc kubenswrapper[4707]: I0218 07:21:17.963289 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s9qh"] Feb 18 07:21:18 crc kubenswrapper[4707]: I0218 07:21:18.558871 4707 generic.go:334] "Generic (PLEG): container finished" podID="b831ec46-4768-4177-b077-8fbf33d23a85" containerID="70137004a17edb06f4beefa6b837a1149e98183c5e820f7fd7041d15e710e7bb" exitCode=0 Feb 18 07:21:18 crc kubenswrapper[4707]: I0218 07:21:18.559175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s9qh" event={"ID":"b831ec46-4768-4177-b077-8fbf33d23a85","Type":"ContainerDied","Data":"70137004a17edb06f4beefa6b837a1149e98183c5e820f7fd7041d15e710e7bb"} Feb 18 07:21:18 crc kubenswrapper[4707]: I0218 07:21:18.559201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s9qh" event={"ID":"b831ec46-4768-4177-b077-8fbf33d23a85","Type":"ContainerStarted","Data":"586f86b574b1a321f7baba4373c4fbde74987d13be062bae1a95ce81ed60d919"} Feb 18 07:21:18 crc kubenswrapper[4707]: I0218 07:21:18.566184 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="79d6a129-35a0-49e3-9711-dc6e9ca19189" containerID="60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd" exitCode=0 Feb 18 07:21:18 crc kubenswrapper[4707]: I0218 07:21:18.566222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrk9s" event={"ID":"79d6a129-35a0-49e3-9711-dc6e9ca19189","Type":"ContainerDied","Data":"60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd"} Feb 18 07:21:18 crc kubenswrapper[4707]: I0218 07:21:18.566244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrk9s" event={"ID":"79d6a129-35a0-49e3-9711-dc6e9ca19189","Type":"ContainerStarted","Data":"48fe17e00d6cbb94a353a324642fbc9382db64c1ba374fbba71c5636f5cd1b5e"} Feb 18 07:21:19 crc kubenswrapper[4707]: I0218 07:21:19.586062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrk9s" event={"ID":"79d6a129-35a0-49e3-9711-dc6e9ca19189","Type":"ContainerStarted","Data":"b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd"} Feb 18 07:21:19 crc kubenswrapper[4707]: I0218 07:21:19.588964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s9qh" event={"ID":"b831ec46-4768-4177-b077-8fbf33d23a85","Type":"ContainerStarted","Data":"26e580ba0b901c91cbe4fe75c70465b780667419dcd48c3b80d11d2fd6dca9f7"} Feb 18 07:21:20 crc kubenswrapper[4707]: I0218 07:21:20.601612 4707 generic.go:334] "Generic (PLEG): container finished" podID="b831ec46-4768-4177-b077-8fbf33d23a85" containerID="26e580ba0b901c91cbe4fe75c70465b780667419dcd48c3b80d11d2fd6dca9f7" exitCode=0 Feb 18 07:21:20 crc kubenswrapper[4707]: I0218 07:21:20.601721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s9qh" 
event={"ID":"b831ec46-4768-4177-b077-8fbf33d23a85","Type":"ContainerDied","Data":"26e580ba0b901c91cbe4fe75c70465b780667419dcd48c3b80d11d2fd6dca9f7"} Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.382441 4707 patch_prober.go:28] interesting pod/machine-config-daemon-sbhs6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.382811 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.382860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.383664 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88"} pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.383720 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerName="machine-config-daemon" containerID="cri-o://e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" gracePeriod=600 Feb 18 07:21:21 crc kubenswrapper[4707]: E0218 07:21:21.508283 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.617187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s9qh" event={"ID":"b831ec46-4768-4177-b077-8fbf33d23a85","Type":"ContainerStarted","Data":"e11c2c9e6c0fe698ee10cbc53101f319e66fc1f956d1cc421560a7671519b842"} Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.620704 4707 generic.go:334] "Generic (PLEG): container finished" podID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" exitCode=0 Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.620774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerDied","Data":"e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88"} Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.620853 4707 scope.go:117] "RemoveContainer" containerID="285a82f9ab065dc06e8aad4a395d2035da5abd7059a2b1dc1270dfccd3ef7a66" Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.622155 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:21:21 crc kubenswrapper[4707]: E0218 07:21:21.622634 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.627951 4707 generic.go:334] "Generic (PLEG): container finished" podID="79d6a129-35a0-49e3-9711-dc6e9ca19189" containerID="b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd" exitCode=0 Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.628031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrk9s" event={"ID":"79d6a129-35a0-49e3-9711-dc6e9ca19189","Type":"ContainerDied","Data":"b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd"} Feb 18 07:21:21 crc kubenswrapper[4707]: I0218 07:21:21.650256 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5s9qh" podStartSLOduration=2.152509936 podStartE2EDuration="4.650236304s" podCreationTimestamp="2026-02-18 07:21:17 +0000 UTC" firstStartedPulling="2026-02-18 07:21:18.56210696 +0000 UTC m=+5615.210066084" lastFinishedPulling="2026-02-18 07:21:21.059833318 +0000 UTC m=+5617.707792452" observedRunningTime="2026-02-18 07:21:21.645700291 +0000 UTC m=+5618.293659426" watchObservedRunningTime="2026-02-18 07:21:21.650236304 +0000 UTC m=+5618.298195448" Feb 18 07:21:22 crc kubenswrapper[4707]: I0218 07:21:22.641291 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrk9s" event={"ID":"79d6a129-35a0-49e3-9711-dc6e9ca19189","Type":"ContainerStarted","Data":"ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535"} Feb 18 07:21:22 crc kubenswrapper[4707]: I0218 07:21:22.670442 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mrk9s" podStartSLOduration=3.201283621 
podStartE2EDuration="6.67042388s" podCreationTimestamp="2026-02-18 07:21:16 +0000 UTC" firstStartedPulling="2026-02-18 07:21:18.567884528 +0000 UTC m=+5615.215843662" lastFinishedPulling="2026-02-18 07:21:22.037024787 +0000 UTC m=+5618.684983921" observedRunningTime="2026-02-18 07:21:22.662588597 +0000 UTC m=+5619.310547741" watchObservedRunningTime="2026-02-18 07:21:22.67042388 +0000 UTC m=+5619.318383014" Feb 18 07:21:27 crc kubenswrapper[4707]: I0218 07:21:27.176639 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:27 crc kubenswrapper[4707]: I0218 07:21:27.177339 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:27 crc kubenswrapper[4707]: I0218 07:21:27.372141 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:27 crc kubenswrapper[4707]: I0218 07:21:27.372562 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:27 crc kubenswrapper[4707]: I0218 07:21:27.436430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:27 crc kubenswrapper[4707]: I0218 07:21:27.755855 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:27 crc kubenswrapper[4707]: I0218 07:21:27.815072 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s9qh"] Feb 18 07:21:28 crc kubenswrapper[4707]: I0218 07:21:28.237221 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mrk9s" podUID="79d6a129-35a0-49e3-9711-dc6e9ca19189" containerName="registry-server" probeResult="failure" output=< Feb 18 
07:21:28 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Feb 18 07:21:28 crc kubenswrapper[4707]: > Feb 18 07:21:29 crc kubenswrapper[4707]: I0218 07:21:29.701868 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5s9qh" podUID="b831ec46-4768-4177-b077-8fbf33d23a85" containerName="registry-server" containerID="cri-o://e11c2c9e6c0fe698ee10cbc53101f319e66fc1f956d1cc421560a7671519b842" gracePeriod=2 Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.712612 4707 generic.go:334] "Generic (PLEG): container finished" podID="b831ec46-4768-4177-b077-8fbf33d23a85" containerID="e11c2c9e6c0fe698ee10cbc53101f319e66fc1f956d1cc421560a7671519b842" exitCode=0 Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.712655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s9qh" event={"ID":"b831ec46-4768-4177-b077-8fbf33d23a85","Type":"ContainerDied","Data":"e11c2c9e6c0fe698ee10cbc53101f319e66fc1f956d1cc421560a7671519b842"} Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.712730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s9qh" event={"ID":"b831ec46-4768-4177-b077-8fbf33d23a85","Type":"ContainerDied","Data":"586f86b574b1a321f7baba4373c4fbde74987d13be062bae1a95ce81ed60d919"} Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.712741 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="586f86b574b1a321f7baba4373c4fbde74987d13be062bae1a95ce81ed60d919" Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.727872 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.885117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw88l\" (UniqueName: \"kubernetes.io/projected/b831ec46-4768-4177-b077-8fbf33d23a85-kube-api-access-vw88l\") pod \"b831ec46-4768-4177-b077-8fbf33d23a85\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.885177 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-utilities\") pod \"b831ec46-4768-4177-b077-8fbf33d23a85\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.885396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-catalog-content\") pod \"b831ec46-4768-4177-b077-8fbf33d23a85\" (UID: \"b831ec46-4768-4177-b077-8fbf33d23a85\") " Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.887704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-utilities" (OuterVolumeSpecName: "utilities") pod "b831ec46-4768-4177-b077-8fbf33d23a85" (UID: "b831ec46-4768-4177-b077-8fbf33d23a85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.900958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b831ec46-4768-4177-b077-8fbf33d23a85-kube-api-access-vw88l" (OuterVolumeSpecName: "kube-api-access-vw88l") pod "b831ec46-4768-4177-b077-8fbf33d23a85" (UID: "b831ec46-4768-4177-b077-8fbf33d23a85"). InnerVolumeSpecName "kube-api-access-vw88l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.909302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b831ec46-4768-4177-b077-8fbf33d23a85" (UID: "b831ec46-4768-4177-b077-8fbf33d23a85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.987346 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.987378 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw88l\" (UniqueName: \"kubernetes.io/projected/b831ec46-4768-4177-b077-8fbf33d23a85-kube-api-access-vw88l\") on node \"crc\" DevicePath \"\"" Feb 18 07:21:30 crc kubenswrapper[4707]: I0218 07:21:30.987388 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b831ec46-4768-4177-b077-8fbf33d23a85-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 07:21:31 crc kubenswrapper[4707]: I0218 07:21:31.721299 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5s9qh" Feb 18 07:21:31 crc kubenswrapper[4707]: I0218 07:21:31.768857 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s9qh"] Feb 18 07:21:31 crc kubenswrapper[4707]: I0218 07:21:31.778420 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s9qh"] Feb 18 07:21:32 crc kubenswrapper[4707]: I0218 07:21:32.063887 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b831ec46-4768-4177-b077-8fbf33d23a85" path="/var/lib/kubelet/pods/b831ec46-4768-4177-b077-8fbf33d23a85/volumes" Feb 18 07:21:36 crc kubenswrapper[4707]: I0218 07:21:36.053955 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:21:36 crc kubenswrapper[4707]: E0218 07:21:36.054786 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:21:37 crc kubenswrapper[4707]: I0218 07:21:37.224398 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:37 crc kubenswrapper[4707]: I0218 07:21:37.272509 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:38 crc kubenswrapper[4707]: I0218 07:21:38.274226 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrk9s"] Feb 18 07:21:38 crc kubenswrapper[4707]: I0218 07:21:38.778138 4707 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-mrk9s" podUID="79d6a129-35a0-49e3-9711-dc6e9ca19189" containerName="registry-server" containerID="cri-o://ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535" gracePeriod=2 Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.304009 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.449834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd2sb\" (UniqueName: \"kubernetes.io/projected/79d6a129-35a0-49e3-9711-dc6e9ca19189-kube-api-access-nd2sb\") pod \"79d6a129-35a0-49e3-9711-dc6e9ca19189\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.449903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-catalog-content\") pod \"79d6a129-35a0-49e3-9711-dc6e9ca19189\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.449987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-utilities\") pod \"79d6a129-35a0-49e3-9711-dc6e9ca19189\" (UID: \"79d6a129-35a0-49e3-9711-dc6e9ca19189\") " Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.451304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-utilities" (OuterVolumeSpecName: "utilities") pod "79d6a129-35a0-49e3-9711-dc6e9ca19189" (UID: "79d6a129-35a0-49e3-9711-dc6e9ca19189"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.455097 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79d6a129-35a0-49e3-9711-dc6e9ca19189-kube-api-access-nd2sb" (OuterVolumeSpecName: "kube-api-access-nd2sb") pod "79d6a129-35a0-49e3-9711-dc6e9ca19189" (UID: "79d6a129-35a0-49e3-9711-dc6e9ca19189"). InnerVolumeSpecName "kube-api-access-nd2sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.554446 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd2sb\" (UniqueName: \"kubernetes.io/projected/79d6a129-35a0-49e3-9711-dc6e9ca19189-kube-api-access-nd2sb\") on node \"crc\" DevicePath \"\"" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.554506 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.590674 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79d6a129-35a0-49e3-9711-dc6e9ca19189" (UID: "79d6a129-35a0-49e3-9711-dc6e9ca19189"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.656590 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d6a129-35a0-49e3-9711-dc6e9ca19189-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.790923 4707 generic.go:334] "Generic (PLEG): container finished" podID="79d6a129-35a0-49e3-9711-dc6e9ca19189" containerID="ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535" exitCode=0 Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.790965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrk9s" event={"ID":"79d6a129-35a0-49e3-9711-dc6e9ca19189","Type":"ContainerDied","Data":"ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535"} Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.790991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrk9s" event={"ID":"79d6a129-35a0-49e3-9711-dc6e9ca19189","Type":"ContainerDied","Data":"48fe17e00d6cbb94a353a324642fbc9382db64c1ba374fbba71c5636f5cd1b5e"} Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.791011 4707 scope.go:117] "RemoveContainer" containerID="ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.791154 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mrk9s" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.814283 4707 scope.go:117] "RemoveContainer" containerID="b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.845100 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrk9s"] Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.849974 4707 scope.go:117] "RemoveContainer" containerID="60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.857226 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mrk9s"] Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.908574 4707 scope.go:117] "RemoveContainer" containerID="ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535" Feb 18 07:21:39 crc kubenswrapper[4707]: E0218 07:21:39.909052 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535\": container with ID starting with ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535 not found: ID does not exist" containerID="ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.909100 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535"} err="failed to get container status \"ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535\": rpc error: code = NotFound desc = could not find container \"ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535\": container with ID starting with ac6af0abcad5aa42be83852a6c7c5b1fbfd87a93dcb5e07133e36f6f1d925535 not found: ID does 
not exist" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.909132 4707 scope.go:117] "RemoveContainer" containerID="b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd" Feb 18 07:21:39 crc kubenswrapper[4707]: E0218 07:21:39.909452 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd\": container with ID starting with b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd not found: ID does not exist" containerID="b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.909476 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd"} err="failed to get container status \"b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd\": rpc error: code = NotFound desc = could not find container \"b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd\": container with ID starting with b582f843fcd5a269e2fa57ab9ba8f43236579f22fa749191f5fc7775e9d128fd not found: ID does not exist" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.909491 4707 scope.go:117] "RemoveContainer" containerID="60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd" Feb 18 07:21:39 crc kubenswrapper[4707]: E0218 07:21:39.909777 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd\": container with ID starting with 60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd not found: ID does not exist" containerID="60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd" Feb 18 07:21:39 crc kubenswrapper[4707]: I0218 07:21:39.909846 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd"} err="failed to get container status \"60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd\": rpc error: code = NotFound desc = could not find container \"60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd\": container with ID starting with 60bd7f31f13a465f476766bcac03d23e5b83ef048c8b9588c09425d3882c8afd not found: ID does not exist" Feb 18 07:21:40 crc kubenswrapper[4707]: I0218 07:21:40.066639 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79d6a129-35a0-49e3-9711-dc6e9ca19189" path="/var/lib/kubelet/pods/79d6a129-35a0-49e3-9711-dc6e9ca19189/volumes" Feb 18 07:21:48 crc kubenswrapper[4707]: I0218 07:21:48.053561 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:21:48 crc kubenswrapper[4707]: E0218 07:21:48.054292 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:21:53 crc kubenswrapper[4707]: I0218 07:21:53.912112 4707 generic.go:334] "Generic (PLEG): container finished" podID="ff999da3-3e5d-4483-81d7-721897325f90" containerID="dee44fba5532e5320af1c501dc94fe6dd710968646ba6a45e9d03fbb566744fd" exitCode=0 Feb 18 07:21:53 crc kubenswrapper[4707]: I0218 07:21:53.912195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" event={"ID":"ff999da3-3e5d-4483-81d7-721897325f90","Type":"ContainerDied","Data":"dee44fba5532e5320af1c501dc94fe6dd710968646ba6a45e9d03fbb566744fd"} Feb 18 
07:21:53 crc kubenswrapper[4707]: I0218 07:21:53.913465 4707 scope.go:117] "RemoveContainer" containerID="dee44fba5532e5320af1c501dc94fe6dd710968646ba6a45e9d03fbb566744fd" Feb 18 07:21:54 crc kubenswrapper[4707]: I0218 07:21:54.706744 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hqdv7_must-gather-gn9z7_ff999da3-3e5d-4483-81d7-721897325f90/gather/0.log" Feb 18 07:22:03 crc kubenswrapper[4707]: I0218 07:22:03.053269 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:22:03 crc kubenswrapper[4707]: E0218 07:22:03.054088 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:22:05 crc kubenswrapper[4707]: I0218 07:22:05.570326 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hqdv7/must-gather-gn9z7"] Feb 18 07:22:05 crc kubenswrapper[4707]: I0218 07:22:05.571418 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" podUID="ff999da3-3e5d-4483-81d7-721897325f90" containerName="copy" containerID="cri-o://b60446957220bf223cce9f544188f1acedb4224c6e95f7e3d46beaea45fb343b" gracePeriod=2 Feb 18 07:22:05 crc kubenswrapper[4707]: I0218 07:22:05.583382 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hqdv7/must-gather-gn9z7"] Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.016631 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hqdv7_must-gather-gn9z7_ff999da3-3e5d-4483-81d7-721897325f90/copy/0.log" Feb 18 
07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.017317 4707 generic.go:334] "Generic (PLEG): container finished" podID="ff999da3-3e5d-4483-81d7-721897325f90" containerID="b60446957220bf223cce9f544188f1acedb4224c6e95f7e3d46beaea45fb343b" exitCode=143 Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.017371 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="321f23fec38d9657191ba0ad61867c31c721d4f836b21ff60086352aafe8182d" Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.079248 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hqdv7_must-gather-gn9z7_ff999da3-3e5d-4483-81d7-721897325f90/copy/0.log" Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.079601 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.172999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57989\" (UniqueName: \"kubernetes.io/projected/ff999da3-3e5d-4483-81d7-721897325f90-kube-api-access-57989\") pod \"ff999da3-3e5d-4483-81d7-721897325f90\" (UID: \"ff999da3-3e5d-4483-81d7-721897325f90\") " Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.173071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff999da3-3e5d-4483-81d7-721897325f90-must-gather-output\") pod \"ff999da3-3e5d-4483-81d7-721897325f90\" (UID: \"ff999da3-3e5d-4483-81d7-721897325f90\") " Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.182108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff999da3-3e5d-4483-81d7-721897325f90-kube-api-access-57989" (OuterVolumeSpecName: "kube-api-access-57989") pod "ff999da3-3e5d-4483-81d7-721897325f90" (UID: "ff999da3-3e5d-4483-81d7-721897325f90"). 
InnerVolumeSpecName "kube-api-access-57989". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.274931 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57989\" (UniqueName: \"kubernetes.io/projected/ff999da3-3e5d-4483-81d7-721897325f90-kube-api-access-57989\") on node \"crc\" DevicePath \"\"" Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.374390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff999da3-3e5d-4483-81d7-721897325f90-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ff999da3-3e5d-4483-81d7-721897325f90" (UID: "ff999da3-3e5d-4483-81d7-721897325f90"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 07:22:06 crc kubenswrapper[4707]: I0218 07:22:06.377699 4707 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff999da3-3e5d-4483-81d7-721897325f90-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 18 07:22:07 crc kubenswrapper[4707]: I0218 07:22:07.024508 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hqdv7/must-gather-gn9z7" Feb 18 07:22:08 crc kubenswrapper[4707]: I0218 07:22:08.066611 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff999da3-3e5d-4483-81d7-721897325f90" path="/var/lib/kubelet/pods/ff999da3-3e5d-4483-81d7-721897325f90/volumes" Feb 18 07:22:15 crc kubenswrapper[4707]: I0218 07:22:15.052827 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:22:15 crc kubenswrapper[4707]: E0218 07:22:15.053656 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:22:27 crc kubenswrapper[4707]: I0218 07:22:27.053460 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:22:27 crc kubenswrapper[4707]: E0218 07:22:27.054192 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:22:39 crc kubenswrapper[4707]: I0218 07:22:39.052834 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:22:39 crc kubenswrapper[4707]: E0218 07:22:39.053638 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:22:54 crc kubenswrapper[4707]: I0218 07:22:54.060373 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:22:54 crc kubenswrapper[4707]: E0218 07:22:54.063667 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:22:57 crc kubenswrapper[4707]: I0218 07:22:57.538012 4707 scope.go:117] "RemoveContainer" containerID="455e3b185dc665bfda1ad672dfa935aaf61a096ea80e107647a15fca0e382c1e" Feb 18 07:22:57 crc kubenswrapper[4707]: I0218 07:22:57.580269 4707 scope.go:117] "RemoveContainer" containerID="b60446957220bf223cce9f544188f1acedb4224c6e95f7e3d46beaea45fb343b" Feb 18 07:22:57 crc kubenswrapper[4707]: I0218 07:22:57.616915 4707 scope.go:117] "RemoveContainer" containerID="dee44fba5532e5320af1c501dc94fe6dd710968646ba6a45e9d03fbb566744fd" Feb 18 07:23:08 crc kubenswrapper[4707]: I0218 07:23:08.052984 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:23:08 crc kubenswrapper[4707]: E0218 07:23:08.053894 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:23:22 crc kubenswrapper[4707]: I0218 07:23:22.053278 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:23:22 crc kubenswrapper[4707]: E0218 07:23:22.054203 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:23:35 crc kubenswrapper[4707]: I0218 07:23:35.053688 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:23:35 crc kubenswrapper[4707]: E0218 07:23:35.054538 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:23:47 crc kubenswrapper[4707]: I0218 07:23:47.053907 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:23:47 crc kubenswrapper[4707]: E0218 07:23:47.054678 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:24:00 crc kubenswrapper[4707]: I0218 07:24:00.054661 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:24:00 crc kubenswrapper[4707]: E0218 07:24:00.055493 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:24:14 crc kubenswrapper[4707]: I0218 07:24:14.080684 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:24:14 crc kubenswrapper[4707]: E0218 07:24:14.081486 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:24:29 crc kubenswrapper[4707]: I0218 07:24:29.053306 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:24:29 crc kubenswrapper[4707]: E0218 07:24:29.054099 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:24:44 crc kubenswrapper[4707]: I0218 07:24:44.062316 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:24:44 crc kubenswrapper[4707]: E0218 07:24:44.063675 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:24:57 crc kubenswrapper[4707]: I0218 07:24:57.053364 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:24:57 crc kubenswrapper[4707]: E0218 07:24:57.054145 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:25:09 crc kubenswrapper[4707]: I0218 07:25:09.052899 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:25:09 crc kubenswrapper[4707]: E0218 07:25:09.053753 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:25:22 crc kubenswrapper[4707]: I0218 07:25:22.053451 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:25:22 crc kubenswrapper[4707]: E0218 07:25:22.054292 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:25:36 crc kubenswrapper[4707]: I0218 07:25:36.053460 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:25:36 crc kubenswrapper[4707]: E0218 07:25:36.054512 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:25:47 crc kubenswrapper[4707]: I0218 07:25:47.053602 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:25:47 crc kubenswrapper[4707]: E0218 07:25:47.055142 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:26:01 crc kubenswrapper[4707]: I0218 07:26:01.053555 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:26:01 crc kubenswrapper[4707]: E0218 07:26:01.054571 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:26:12 crc kubenswrapper[4707]: I0218 07:26:12.054305 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:26:12 crc kubenswrapper[4707]: E0218 07:26:12.055100 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbhs6_openshift-machine-config-operator(185c5347-f458-48a7-bcc8-0b0fcd7b4850)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" podUID="185c5347-f458-48a7-bcc8-0b0fcd7b4850" Feb 18 07:26:23 crc kubenswrapper[4707]: I0218 07:26:23.054265 4707 scope.go:117] "RemoveContainer" containerID="e333644161e26526ac569c81d804fdcd51a6fc372a0abe9f4c1db2fe912c2d88" Feb 18 07:26:23 crc kubenswrapper[4707]: I0218 07:26:23.298649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-sbhs6" event={"ID":"185c5347-f458-48a7-bcc8-0b0fcd7b4850","Type":"ContainerStarted","Data":"cf7705fbd41e55f1f5c65f58e015e8782f48098d619b223e438cd3ec2b114f46"}